Builds, enhances, and maintains an Enterprise Data Warehouse and Data Lake that serve the reporting and analytical needs of VNSNY, as well as those of IT consumers. Participates in the Systems Delivery Life Cycle for Data Integration projects, including preparing detailed technical design documents from functional requirement documents, designing and developing the Extract, Transform and Load (ETL) framework, establishing database application schemas and developing database code, participating in unit and system testing of end-to-end integrations, and performing release management and production support as needed.
- Implements and maintains VNSNY's Enterprise Data Warehouse. Responsible for populating the data warehouse with extracts from platforms including, but not limited to, Oracle, Microsoft SQL Server, NoSQL databases, flat files, JSON files, XML files, spreadsheets, and API data accessed via REST/SOAP.
- Creates end-to-end solutions for ETL/ELT transformation jobs, which involves writing Informatica workflows and mappings, Oracle PL/SQL procedures and functions, SQL Server T-SQL procedures, and Unix shell and Python scripts to load VNSNY data from disparate source systems into the Operational Staging, Warehouse, and Reporting layers of the Data Lake stack (a minimal illustrative sketch of such an extract-and-stage step follows this list).
- Assists in planning and organizing the activities/duties of the area and in the development of the overall project plans and timetables. Designs and develops ETL to load the data warehouse and data marts, utilizing reusable design components. Tunes ETL code for efficiency.
- Supports the user community and Information Systems Development groups.
- Maps data sources, data movement, interfaces, and analytics, with the goal of ensuring data quality and lineage.
- Prototypes solutions, prepares test scripts, and conducts tests for data replication, extraction, loading, cleansing, and data modeling for the data warehouse.
- Maintains knowledge of the software tools, languages, scripts, and shells that effectively support the data warehouse environment across different operating systems.
- Participates in special projects and performs other related duties as required.
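For illustration only, the sketch below shows the general shape of the Python-scripted extract-and-stage work described above: pulling JSON records from a REST source and landing them in a raw staging table before transformation. The endpoint, table name, and column layout are hypothetical, and the standard-library SQLite database stands in for the Oracle/SQL Server staging schemas an actual pipeline would target, so the example stays self-contained.

```python
import json
import sqlite3

import requests  # third-party HTTP client (pip install requests)

API_URL = "https://example.com/api/v1/records"  # hypothetical REST endpoint
STAGING_DB = "staging.db"  # SQLite stand-in for a real staging schema


def extract(url: str) -> list[dict]:
    """Pull JSON records from a REST source, as in the API-based extracts above."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.json()


def stage(records: list[dict], db_path: str) -> None:
    """Land raw records in an operational staging table before transformation."""
    conn = sqlite3.connect(db_path)
    try:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS stg_record_raw ("
            "  record_id TEXT PRIMARY KEY,"
            "  payload   TEXT NOT NULL"  # raw JSON, parsed later by transform jobs
            ")"
        )
        conn.executemany(
            "INSERT OR REPLACE INTO stg_record_raw (record_id, payload) VALUES (?, ?)",
            [(str(r.get("id")), json.dumps(r)) for r in records],
        )
        conn.commit()
    finally:
        conn.close()


if __name__ == "__main__":
    stage(extract(API_URL), STAGING_DB)
```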
Bachelor's degree in Computer Science, Information Systems, or a related field required.
- Minimum of three years of experience with a Bachelor's degree, or one year of experience with a Master's degree in one of the above majors, in data integration projects working on ETL (or ELT) frameworks for data warehouse projects, preferably using Informatica 10.x, required.
- Experience with performance-tuning techniques for ETL code as well as the underlying databases required (a minimal illustration follows this list).
- Experience with database development for data warehouse projects, including writing SQL queries and PL/SQL functions, procedures, and packages on both leading open-source and commercial database platforms, required.
- Familiarity with NoSQL databases and general big data concepts and technologies preferred.
- Experience with RESTful web services required.
- Experience with Unix/Linux shell and Python scripting required.
- Experience with advanced scheduling tools such as Control-M, AutoSys, or Tivoli required.
- Working knowledge of data modeling, including the logical, conceptual, and physical modeling aspects of data warehouse projects, required.
- Familiarity with cloud technologies (Snowflake, RDS, S3, Lambda, EC2, etc.) as they relate to data integration required.
- Working knowledge of an analytics/visualization platform such as Oracle OBIEE, MicroStrategy, or Tableau required.
- Demonstrated ability to work independently, take initiative, and contribute new ideas required.
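As a hedged illustration of the ETL performance tuning referenced above (not prescribed by this posting): a common optimization is replacing row-at-a-time inserts with a single batched, set-based load, the analogue of array/bulk binding in Oracle or SQL Server ETL code. The sketch uses Python's standard-library sqlite3 as a stand-in for the target database; the fact_visits table and its columns are hypothetical.

```python
import sqlite3
from collections.abc import Iterable


def load_rows_slow(conn: sqlite3.Connection, rows: Iterable[tuple]) -> None:
    # Anti-pattern: one statement execution per row, so per-row overhead
    # dominates on large loads.
    for row in rows:
        conn.execute(
            "INSERT INTO fact_visits (visit_id, patient_id, cost) VALUES (?, ?, ?)",
            row,
        )
    conn.commit()


def load_rows_fast(conn: sqlite3.Connection, rows: Iterable[tuple]) -> None:
    # Set-based load: the driver binds batches of parameters to a single
    # prepared statement, cutting per-row overhead.
    conn.executemany(
        "INSERT INTO fact_visits (visit_id, patient_id, cost) VALUES (?, ?, ?)",
        rows,
    )
    conn.commit()


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE fact_visits (visit_id INTEGER, patient_id INTEGER, cost REAL)"
    )
    load_rows_fast(conn, [(1, 101, 250.0), (2, 102, 95.5)])
    print(conn.execute("SELECT COUNT(*) FROM fact_visits").fetchone()[0])
```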