Job Number: R0084634
Data Ingest Engineer, Senior
Are you looking for an opportunity to develop a data platform that will have an impact on the rapid exploitation and sharing of multi-INT information across the intelligence community? Solid platform development is a critical part of any program's success, and you know how to do it right – scalable design with baked-in security. That's why we need you, a developer with the skills to build a platform that will transform the integrated intelligence mission.
As a data ingest developer on our team, you'll design and develop the core data ingestion architecture and specific data source ingestion flows for our project, from initial development through operational deployment. You'll work with customers and end users to understand their mission, current architecture, and security requirements. With a focus on the customer's goals, you'll build a design that will scale to meet their evolving needs. Your technical expertise will be vital as you recommend tools and capabilities based on your research of the current environment and new technology. Your design will set the standard for future development, so you'll craft an architecture that works smoothly with existing infrastructure without compromising security. As a technical leader, you'll guide the team in integrating and optimizing multiple data flows in the system to help your customers meet their toughest challenges. This is a chance to use your deep understanding of the ingestion and transformation of multi-INT data sources and broaden your skill set into areas like cloud computing; large-scale data ingestion and processing; multiple data store types, including graph, time series, spatial, and other NoSQL stores; high-performance data streaming; data science; and support for integrating novel mission applications and analytics. Join us as we develop software-based solutions to make a difference for the integrated intelligence mission.
Empower change with us.
You Have:
-5+ years of experience with Java or ETL engineering
-2+ years of experience working in data ingestion, processing, and distribution
-2+ years of experience performing ETL activities using Apache NiFi
-1+ years of experience developing and deploying data ingestion, processing, and distribution systems on AWS technologies
-Experience working with IC data sets and NoSQL databases, including Elasticsearch and HBase
-Experience using the following AWS data stores: RDS for PostgreSQL, S3, or DynamoDB
-Experience with Agile software development practices and pub/sub messaging technology, including Apache Kafka
-Top Secret clearance required
-HS diploma or GED
-Ability to obtain Security+ CE, SSCP, CCNA-Security, or GSEC Certification within 6 months of start date
Nice If You Have:
-2+ years of experience with cognitive computing, data integration, data mining, Natural Language Processing, Hadoop platforms, or automating machine learning components
-1+ years of experience with data mining using current methods and tools
-Experience with graph data stores, time series databases, and other NoSQL technologies
-Knowledge of one or more of the following: Jira, Git, Kafka, Kubernetes, Rancher, or Docker
-Knowledge of data science tools and their integration with big data stores
-Knowledge of data security policies and guidelines
-TS/SCI clearance with a polygraph preferred
-BA or BS degree preferred; MA or MS degree a plus
-Security+ CE, SSCP, CCNA-Security, or GSEC Certification
Applicants selected will be subject to a security investigation and may need to meet eligibility requirements for access to classified information; Top Secret clearance is required.
We're an EOE that empowers our people—no matter their race, color, religion, sex, gender identity, sexual orientation, national origin, disability, veteran status, or other protected characteristic—to fearlessly drive change.
Apply on company website