ETL Engineer with Security Clearance Engineering - Chantilly, VA at Geebo

Jacobs is hiring an ETL Engineer to join our growing team in Chantilly, VA! The successful candidate will work with a midsize agile product team and will have hands-on experience with the data pipeline from ingest to visualization. The team uses cloud technologies in AWS. This is a mission-critical effort for the Customer and the Nation's security.

The ETL Engineer is responsible for developing, sustaining, and improving the current system's data ingest functions and making adjustments as needed by: performing analysis, monitoring, and configuring the solution; creating or updating detailed design information; accessing and reviewing the code repository; administering code reviews; adjusting unit testing; maintaining build support, including continuous integration; sustaining operational support activities and responding to mission-critical requests; and working with application development tools, standards, and languages.
Responsibilities:
  • Build and optimize data pipelines to extract data from various sources, transform it into the required format, and load it using Databricks and AWS services
  • Implement data validation, cleansing, and enrichment techniques to improve the accuracy and completeness of data
  • Monitor and troubleshoot ETL processes to identify and resolve issues in a timely manner

Experience:
  • At least 6 years of experience with ETL

Education:
  • Bachelor's degree in Engineering, Computer Science, or other related analytical, scientific, or technical discipline

Clearance Requirement:
  • Active Top Secret, SCI eligible, with the ability to obtain a polygraph

Database ETL engineers:
  • Experience with Oracle 11g/12c, Sun Solaris OS, Linux (CentOS, Red Hat), and Windows environments

Software/scripting engineers:
  • Experience performing software and scripting engineering for data ingest with Java or Python for ETL
  • Experience with web services and/or microservices
  • Experience with ETL tools such as NiFi or Informatica
  • Experience using software repository tools such as Git or SVN
  • Hands-on experience developing ETL processes using Java or Scala in Big Data frameworks like Hadoop MapReduce, Apache NiFi, or Apache Spark
  • Development/deployment experience with containerized components using Docker in a Linux environment
  • Experience programmatically parsing different data formats like JSON, XML, and CSV
  • Experience with relational data sources and SQL
  • Comfortable working in an AWS environment
  • Strong proficiency in programming languages such as Scala or Java
  • Experience designing and developing ETL workflows using tools like Apache Spark or AWS Glue
  • In-depth knowledge of ETL best practices, data integration techniques, and data quality management
  • Familiarity with different data storage technologies and databases, such as Amazon S3 or Amazon Redshift

Recommended Skills: Agile Methodology, Amazon Redshift, Apache Hadoop, Apache NiFi, Apache Spark, Big Data
Think you're the perfect candidate? Apply on the company site.

Estimated Salary: $20 to $28 per hour based on qualifications.
