Utah IT Jobs

Utah Department of Workforce Services

Job Information

OVERSTOCK.COM, INC. Big Data Software Developer I in MIDVALE, Utah

Big Data Software Developer I (Overstock.com, Inc., Midvale, UT). Multiple openings available.

Use strong communication skills (oral and written) to participate in a team environment for the delivery and maintenance of data solutions. Program, debug, and test applications in a development environment. Be responsible for database design and system testing. Follow legal policies as directed.

Create Extract, Transform, Load (ETL) jobs in Java, using Spark, to store data on the Hadoop Distributed File System (HDFS); use Apache Sqoop to transfer bulk data out to Teradata tables; and write Oozie workflows and sub-workflows to manage and validate exported data (a sketch of such an ETL job appears after this posting). Modify existing Pig and Java applications depending on use cases, create workflows that use Sqoop to transfer bulk data to Teradata tables, and validate record counts through the entire workflow.

Create Storm topologies to get the data required for the bidding system, store it in Phoenix tables, and push the manipulated data from previous stages to a Kafka topic, using Storm (1.1.0) and Kafka (0.10.2.0); see the topology sketch below. Rewrite existing Storm topologies to upgrade the Storm and Kafka software, move them from Trident to core Storm, and use tumbling windows to process data in groups to better utilize memcached and Apache Phoenix connections. Develop data processing pipelines using Spark, Kafka, and StreamSets to support variable rates of data flow.

Use strong analytics knowledge to help translate business and analyst requirements into software development requirements in order to build the tools required by the business. Use a strong understanding of different data structures (including JavaScript Object Notation (JSON), Extensible Markup Language (XML), Avro, ORC, Comma-Separated Values (CSV), and multi-line Log4j format) to determine the pros and cons of the most common structures, the best way to handle each structure, and when to use which structure for different applications. Identify and rewrite legacy batch-oriented data processing jobs as streaming, real-time processing jobs, such as converting legacy Pig or Java/Python MapReduce jobs to Spark, Flink, or Storm jobs as necessary (a streaming sketch appears below). Write and maintain highly performant and scalable ingest and egress tools.

Minimum Requirements: Bachelor's degree or U.S. equivalent in Computer Science, Computer Engineering, Software Engineering, Electrical Engineering, Electronic Engineering, Mathematics, or a related field, plus 3 years of professional experience using core technologies and methodologies (including Java, Scala, Spark, JMS, Kafka, continuous integration/delivery, SOA principles, MapReduce, Hadoop, developing data products at scale, and Test-Driven Development principles) to conduct the full lifecycle of computer programming (including building, maintaining, and testing the source code of computer programs). Must also have the following: 1 year of professional experience writing applications in Java; 1 year of professional experience writing corresponding unit tests for Java applications; 1 year of professional experience programming, debugging, and testing applications using Maven and Subversion; and 1 year of professional experience creating and modifying applications using Oracle, Teradata, or Hadoop features (including Pig, MapReduce, Hive, Sqoop, and HBase).

Please submit your resume online at https://www.overstock.com/careers or via email to overstockcareers@overstock.com. Please specify ad code VPWM. EOE. M/F/D/V.
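For illustration, here is a minimal sketch of the kind of Spark ETL job in Java described above: read raw data, transform it, and store the result on HDFS ready for a later Sqoop export to Teradata. The paths, column names, filter condition, and the OrdersEtlJob class are illustrative assumptions, not details from the posting.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class OrdersEtlJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("orders-etl")
                .getOrCreate();

        // Extract: read raw JSON records from HDFS.
        Dataset<Row> raw = spark.read().json("hdfs:///data/raw/orders");

        // Transform: keep completed orders and project the columns
        // downstream consumers need.
        Dataset<Row> cleaned = raw
                .filter("status = 'COMPLETED'")
                .select("order_id", "customer_id", "total", "order_date");

        // Load: write the result back to HDFS as ORC, partitioned by date,
        // ready for a Sqoop bulk export to Teradata tables.
        cleaned.write()
                .mode("overwrite")
                .partitionBy("order_date")
                .orc("hdfs:///data/curated/orders");

        spark.stop();
    }
}
```

In the workflow the posting describes, a job like this would typically run as one Oozie action, with a Sqoop export action and count-validation steps following it in the same workflow.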
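A minimal sketch of a Storm (1.1.x) topology along the lines described: a Kafka spout consumes raw events, a trivial bolt stands in for the real manipulation stage, and a KafkaBolt pushes the results to a downstream topic. The broker address, topic names, and the BidTopology/UppercaseBolt classes are illustrative assumptions; the spout comes from the storm-kafka-client module, and the KafkaBolt classes ship under org.apache.storm.kafka.bolt.

```java
import java.util.Properties;

import org.apache.storm.Config;
import org.apache.storm.StormSubmitter;
import org.apache.storm.kafka.bolt.KafkaBolt;
import org.apache.storm.kafka.bolt.mapper.FieldNameBasedTupleToKafkaMapper;
import org.apache.storm.kafka.bolt.selector.DefaultTopicSelector;
import org.apache.storm.kafka.spout.KafkaSpout;
import org.apache.storm.kafka.spout.KafkaSpoutConfig;
import org.apache.storm.topology.BasicOutputCollector;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.TopologyBuilder;
import org.apache.storm.topology.base.BaseBasicBolt;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Tuple;
import org.apache.storm.tuple.Values;

public class BidTopology {

    // Stand-in for the real manipulation stage: re-emits each record with
    // the "key"/"message" fields the default Kafka mapper below expects.
    public static class UppercaseBolt extends BaseBasicBolt {
        @Override
        public void execute(Tuple input, BasicOutputCollector collector) {
            collector.emit(new Values(input.getStringByField("key"),
                    input.getStringByField("value").toUpperCase()));
        }

        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            declarer.declare(new Fields("key", "message"));
        }
    }

    public static void main(String[] args) throws Exception {
        TopologyBuilder builder = new TopologyBuilder();

        // Spout: consume raw bid events from Kafka.
        builder.setSpout("raw-bids", new KafkaSpout<>(
                KafkaSpoutConfig.builder("broker:9092", "raw-bids").build()));

        // Bolt: manipulate the events before publishing.
        builder.setBolt("transform", new UppercaseBolt())
                .shuffleGrouping("raw-bids");

        // Bolt: push the manipulated data to the downstream Kafka topic.
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker:9092");
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        KafkaBolt<String, String> sink = new KafkaBolt<String, String>()
                .withProducerProperties(props)
                .withTopicSelector(new DefaultTopicSelector("enriched-bids"))
                .withTupleToKafkaMapper(new FieldNameBasedTupleToKafkaMapper<>());
        builder.setBolt("publish", sink).shuffleGrouping("transform");

        StormSubmitter.submitTopology("bid-topology", new Config(),
                builder.createTopology());
    }
}
```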
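Finally, a minimal sketch of rewriting a legacy batch job as a streaming job, using Spark Structured Streaming over Kafka (Spark being one of the conversion targets the posting names). The topic, broker address, paths, and the ClickstreamStreamingJob class are illustrative assumptions; running it requires the spark-sql-kafka connector on the classpath.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;

public class ClickstreamStreamingJob {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("clickstream-streaming")
                .getOrCreate();

        // Source: the records the legacy batch job used to read nightly
        // from HDFS, now consumed continuously from a Kafka topic.
        Dataset<Row> events = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "broker:9092")
                .option("subscribe", "clickstream")
                .load()
                .selectExpr("CAST(value AS STRING) AS line");

        // Sink: append results to HDFS with checkpointing for fault
        // tolerance, replacing the legacy batch output step.
        StreamingQuery query = events.writeStream()
                .format("parquet")
                .option("path", "hdfs:///data/streaming/clickstream")
                .option("checkpointLocation", "hdfs:///checkpoints/clickstream")
                .start();

        query.awaitTermination();
    }
}
```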
