Utah IT Jobs

BioFire Diagnostics, LLC. ETL Developer I in Salt Lake City, Utah

BioFire Diagnostics, LLC is looking to add an ETL Developer I to our team!

The ETL Developer will analyze, design, develop, test, and maintain the data pipeline component of BioFire’s Data Warehouse. This includes refining the stored procedures behind daily and real-time data loads to make them more efficient and scalable.
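
For context, a minimal sketch of the kind of incremental (delta-only) load such refinement targets, issued here as a T-SQL MERGE from Python via pyodbc; in practice the logic might live inside a stored procedure instead. The connection string, schemas, tables, and columns are hypothetical, not BioFire’s actual warehouse.

```python
# Illustrative only: apply just the changed rows from a staging table to a
# warehouse fact table instead of truncating and reloading everything.
# Server, schema, table, and column names are hypothetical.
import pyodbc

MERGE_SQL = """
MERGE dw.FactOrders AS tgt
USING stg.Orders AS src
    ON tgt.OrderID = src.OrderID
WHEN MATCHED AND src.ModifiedDate > tgt.ModifiedDate THEN
    UPDATE SET tgt.Quantity = src.Quantity,
               tgt.Amount = src.Amount,
               tgt.ModifiedDate = src.ModifiedDate
WHEN NOT MATCHED BY TARGET THEN
    INSERT (OrderID, Quantity, Amount, ModifiedDate)
    VALUES (src.OrderID, src.Quantity, src.Amount, src.ModifiedDate);
"""

def run_incremental_load(conn_str: str) -> None:
    """Apply only the delta from staging to the warehouse fact table."""
    conn = pyodbc.connect(conn_str)
    try:
        conn.cursor().execute(MERGE_SQL)
        conn.commit()
    finally:
        conn.close()

if __name__ == "__main__":
    run_incremental_load(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=localhost;DATABASE=EDW;Trusted_Connection=yes;"
    )
```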

The ETL Developer will work closely with BI Analysts and Engineers to build a Data Warehouse that provides quality, accurate, accessible, and governed data capable of delivering business insight to various BioFire business domains through reporting, dashboards, and visualizations.

The ETL Developer will promote the Kimball methodology and use the current technology stack to find innovative solutions to complex data problems. The ability to work as a team player alongside a talented group of individuals toward a common goal is essential to success in the role.
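
As a concrete reference point for the Kimball approach mentioned above, here is a minimal, hypothetical sketch of dimensional shaping in Python/pandas: descriptive attributes go into a surrogate-keyed dimension, and the fact table keeps only measures plus foreign keys. The data and names are invented for illustration.

```python
# Illustrative Kimball-style shaping: a surrogate-keyed dimension plus a fact
# table holding measures and foreign keys. All data and names are made up.
import pandas as pd

source = pd.DataFrame({
    "order_id": [1001, 1002, 1003],
    "product":  ["Panel A", "Panel B", "Panel A"],
    "quantity": [2, 1, 5],
})

# Dimension: one row per distinct product, keyed by a surrogate integer.
dim_product = (
    source[["product"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .rename_axis("product_key")
    .reset_index()
)

# Fact: measures plus the foreign key to the dimension (no descriptive text).
fact_orders = source.merge(dim_product, on="product")[
    ["order_id", "product_key", "quantity"]
]

print(dim_product)
print(fact_orders)
```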

Principal Job Duties and Responsibilities:

  • Perform all work in compliance with company policy and within the guidelines of BioFire’s Quality System.

  • Understand BioFire’s technical and scientific mission.

  • Understand technical and functional components of BioFire Diagnostics

  • Perform role in development of BI infrastructure including design, build, management, maintenance, and optimization of a data warehouse

  • Achieve the optimal target architecture by analyzing the current ETL landscape and determining whether pipelines are best handled through an ETL tool, stored procedures, or a combination of both

  • Participate in requirements, design, and analysis sessions to ensure sound team decision-making and an effective ETL strategy that aligns with BioFire company objectives

  • Adapt ETL processes to accommodate changes in source systems and new business user requirements

  • Develop, test, monitor and troubleshoot ETL processes

  • Automate the ETL process through scheduling and exception-handling routines, as well as source-to-target mapping development, support, and maintenance (a minimal sketch of this pattern follows this list)

  • Thoroughly document ETL processes and the overall Data Warehouse landscape within the Data Catalog

  • Participate in collaboration efforts with global BI development teams (e.g. bioMerieux France, Cognizant development team in India)

  • Collaborate with internal data analyst/data science teams to provide accurate analytics to the business and a data warehouse/lake that allows for flexibility and usability

  • Where applicable, build out new and useful technologies within our Microsoft Azure environment, such as Azure Databricks and Azure Machine Learning, as well as big data technologies such as Hive, Spark, and Kafka

  • Prioritize and complete data requests in a timely manner

  • Complete projects on time with minimal supervision.

  • Continually update technical knowledge and skills.

  • Participate in sprint planning, standup and retrospective meetings as needed

  • Domestic and international travel may be required.
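
The sketch referenced in the automation bullet above: a minimal, hypothetical example of the scheduling/exception-handling idea in Python. load_increment() is a placeholder for a real pipeline step, and in practice the scheduling itself would usually live in an orchestrator (e.g., SQL Agent or an ETL tool’s scheduler); retrying transient failures and surfacing the final error is one common way to implement this.

```python
# Illustrative only: the exception-handling idea behind an automated ETL run.
# load_increment() is a hypothetical placeholder for a real pipeline step.
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def load_increment() -> None:
    """Placeholder for one source-to-target load step."""
    ...

def run_with_retries(step, attempts: int = 3, backoff_seconds: int = 60) -> None:
    """Run a step, retrying transient failures and surfacing the final error."""
    for attempt in range(1, attempts + 1):
        try:
            step()
            log.info("%s succeeded on attempt %d", step.__name__, attempt)
            return
        except Exception:
            log.exception("%s failed (attempt %d/%d)", step.__name__, attempt, attempts)
            if attempt == attempts:
                raise  # let the scheduler alert on the final failure
            time.sleep(backoff_seconds * attempt)

if __name__ == "__main__":
    run_with_retries(load_increment)
```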

Principal Decisions:

  • Recommend process improvement opportunities related to BI and the Data Warehouse

  • BI is an integral member of the data architecture team; the ETL Developer will contribute meaningfully to CAB and data governance discussions to ensure the Data Warehouse provides quality, accurate, accessible, and governed data

Training and Education:

  • BS degree in Computer Science, IS, IT or related field required; advanced degree preferred

  • Professional BI certifications preferred

Experience:

  • 0-2 years of relevant experience required

  • Experience in BI / data integration and solution implementation projects required (BI architecture and design, solution implementation, and/or data warehouse development); MS Azure, Amazon AWS, and/or SAP Hana (SCP) preferred

Skills:

  • Understanding of ERP and CRM system data infrastructure and how it relates to data warehouse transformation requirements for analysis.

  • Expert-level SQL (MS SQL/T-SQL, MySQL, and/or Amazon Redshift SQL)

  • Experience with one or more ETL Tools preferred (SAP Data Services, MS Data Factory, Oracle Data Integrator, Informatica, SSIS)

  • Experience with stored procedures

  • Experience with Data Warehouse design methodologies (Kimball)

  • Experience in BI Software (Power BI, DOMO, Tableau, etc.) preferred

  • Experience in one of the following programming/analytical languages preferred: R, Python, C#, Java

  • Experience working with data from SAP and Salesforce a plus

  • Experience with Jira preferred

  • Proficient in the MS Office Suite and flowcharting software

  • Drive for continuous improvement

  • Must demonstrate the ability to communicate effectively and work well within a team

  • Must demonstrate the ability to complete objectives without high levels of supervision.
