This client runs a modern, quality-driven, microservices-based distributed system in the public cloud (AWS). Their stack includes Scala, Play, Akka, MongoDB, PostgreSQL, Docker, Mesos, Marathon, Jenkins, Kafka, Spark, and HDFS. They continuously evaluate their tech stack and have an easy process for suggesting and adopting new technologies. Their engineers enjoy being empowered and accountable.
We are looking for expert Business Intelligence professionals to join our data team, which also includes engineers and scientists, to drive our analytics, reporting, and business intelligence.
Analyze data from consumers' interactions with their healthcare insurers and providers, monitor trends, and develop strategies and opportunities to improve their health and lower their costs
Partner with employers and healthcare insurers to provide a phenomenal UI for covered members and their families to manage both their health and their healthcare options
Develop actionable insights for population health management and positive recommendations for ways individuals can improve their health and manage their costs
Own the design, development, implementation, and support of critical enterprise end-to-end Business Intelligence ETL solutions in Hadoop, sourcing data from HDFS, Amazon Redshift, MongoDB, or Postgres environments and utilizing Python, Spark, and Hive
Handle product or project conception, design initial product specifications, and lead scheduling, estimating, and securing of resources
Provide technical guidance to other internal and external teams
Help to train new employees and stay ahead of industry trends and issues
Maintain business partner engagement and set expectations
Assess current processes and recommend changes as needed
Document and communicate technical specifications to ensure that proper, optimized techniques, queries, data standards, and final outputs are understood and incorporated into data and analytics processes
Participate in business analysis activities to gather reporting and dashboard requirements
Translate business requirements into specifications that will be used to implement the required user-friendly environments, reports and dashboards, built from potentially multiple data sources
Advanced working knowledge of, and ability to write, complex SQL and HQL queries in an HDFS environment
Extensive hands-on experience working with Python and PySpark for the purposes of data transformations and ETL
Strong familiarity with Kimball, OLAP, and EDW data design methodologies
8+ years' experience in ETL, data engineering, or BI, with a concentration on data transformations
Understanding of various data extraction and transformation techniques with data sourced in HDFS, MongoDB, and Postgres
Working familiarity with Pentaho, Airflow, or Oozie
Knowledge of Scala is a bonus
Familiarity with data visualization best practices
Ability to succeed in a dynamic, Agile environment
Strong prioritization and time-management skills
Dedication to team goals that include support of live 24/7 production systems
A consummate collaborator, able to establish good relationships with technical, product, and business owners
A champion of quality, able to QA and vouch for the integrity of the report output