Job Information

Genesis10 Sr. Data Engineer in Green Bay, Wisconsin

JOB REQUIREMENTS: Genesis10 is currently seeking a Sr. Data Engineer for a contract position with a bank in Green Bay, WI, working remotely through June 2025.

Location: Remote. Candidates may work from the following states: Arizona, Connecticut, Florida, Iowa, Illinois, Indiana, Maine, Massachusetts, Michigan, Minnesota, Missouri, Nebraska, Nevada, New Hampshire, New Jersey, New York, Ohio, Pennsylvania, Rhode Island, South Carolina, Texas, and Wisconsin.

Hours: Normal business hours

Compensation: $65-85/hour

Project Overview: Loan Origination System (LOS) replacement plus Booking & Funding solutions; the most likely option is Moody's LOS. Scope also includes Small Business Portal integrations, Doc Prep & Collateral Management, and multiple key system integrations.

Responsibilities:
- Build out and expand the current DRR processing that exists in Snowflake to include Commercial loans
- Build and deliver three one-time data migration files for CreditLens (a migration-file extract is sketched after this posting)
- Build and deliver two recurring batch data import files for CreditLens
- Support the development and delivery of a standard data extract file from CreditLens to DRR only
- Modify Snowflake to switch from handling the current Optimist data to handling the new CreditLens data
- Develop technical documentation to be leveraged in the Phase 2 project

The Data Engineer will have a strong background in data engineering, with extensive experience designing, building, and maintaining scalable data pipelines and architectures. As a Data Engineer, you will play a critical role in shaping our data infrastructure, ensuring the availability, reliability, and performance of our data systems. You will:
- Design, develop, test, and deploy streaming and batch ingestion methods and pipelines across a variety of data domains, leveraging programming languages, application integration software, messaging technologies, REST APIs, and ETL/ELT tools
- Ensure that high-throughput, low-latency, fault-tolerant data pipelines are developed by applying best practices to data mapping, code development, error handling, and automation (an error-handling pattern is sketched after this posting)
- As part of an agile team, design, develop, and maintain an optimal data pipeline architecture using both structured data sources and big data, in both on-premises and cloud-based environments
- Develop and automate ETL/ELT code using scripting languages and ETL tools to support all reporting and analytical data needs
- Following DataOps best practices, enable orchestration of data, tools, environments, and code
- Design and build dimensional data models to support the data warehouse initiatives (a star-schema example is sketched after this posting)
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data pipeline performance, and re-designing infrastructure for greater scalability and access to information
- Participate in requirements-gathering sessions to distill technical requirements from business requests
- Collaborate with business partners... For full info, follow the application link.

Genesis10 is an Equal Opportunity Employer/Minorities/Female/Disabled/Veteran

***** APPLICATION INSTRUCTIONS: Apply Online: ipc.us/t/2801ACEED9AE43F3
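To make the one-time CreditLens migration-file work concrete, here is a minimal sketch of extracting data from Snowflake to a flat file. The posting does not specify the actual CreditLens import layout, so every identifier here (credentials, table, columns, file name) is hypothetical.

```python
# Minimal sketch: one-time migration extract from Snowflake to CSV.
# All identifiers are hypothetical; the real CreditLens layout is
# not specified in the posting.
import csv
import snowflake.connector  # pip install snowflake-connector-python

EXTRACT_SQL = """
    SELECT loan_id, borrower_name, origination_date, principal_amount
    FROM commercial_loans            -- hypothetical source table
    WHERE status = 'ACTIVE'
"""

def export_migration_file(out_path: str) -> int:
    """Run the extract query and write one CSV migration file.

    Returns the row count so the run can be reconciled against
    the source system.
    """
    conn = snowflake.connector.connect(
        user="etl_user",             # hypothetical credentials; use a
        password="...",              # secrets manager in real code
        account="bank_account",
        warehouse="ETL_WH",
        database="LENDING",
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        cur.execute(EXTRACT_SQL)
        with open(out_path, "w", newline="") as f:
            writer = csv.writer(f)
            # Header row from the query's column metadata.
            writer.writerow([col[0] for col in cur.description])
            rows = 0
            for row in cur:
                writer.writerow(row)
                rows += 1
    finally:
        conn.close()
    return rows

if __name__ == "__main__":
    print(export_migration_file("creditlens_migration_1.csv"), "rows written")
```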
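The general responsibilities emphasize fault-tolerant pipelines built on disciplined error handling. Independent of any specific tool named in the posting, a minimal sketch of that pattern: a batch loader that retries transient sink failures and quarantines bad records in a dead-letter file, so one malformed row does not abort the whole load. The record shape, validation rule, and sink are all illustrative.

```python
# Sketch of the fault-tolerance pattern: retry transient failures,
# divert bad records to a dead-letter file, never abort the batch.
import csv
import time

MAX_RETRIES = 3

def validate(record: dict) -> dict:
    """Hypothetical mapping/validation step; raises ValueError on bad data."""
    if not record.get("loan_id"):
        raise ValueError("missing loan_id")
    record["principal_amount"] = float(record["principal_amount"])
    return record

def load(record: dict) -> None:
    """Stand-in for the real sink (e.g. an INSERT into the warehouse)."""
    print("loaded", record["loan_id"])

def run_batch(in_path: str, dead_letter_path: str) -> tuple[int, int]:
    ok = failed = 0
    with open(in_path, newline="") as src, \
         open(dead_letter_path, "w", newline="") as dlq:
        dlq_writer = csv.writer(dlq)
        for record in csv.DictReader(src):
            try:
                clean = validate(record)
            except ValueError as exc:
                # Bad data is quarantined, not fatal.
                dlq_writer.writerow([*record.values(), str(exc)])
                failed += 1
                continue
            for attempt in range(1, MAX_RETRIES + 1):
                try:
                    load(clean)
                    ok += 1
                    break
                except ConnectionError:
                    # Transient sink failure: back off and retry.
                    if attempt == MAX_RETRIES:
                        dlq_writer.writerow([*record.values(), "sink unavailable"])
                        failed += 1
                    else:
                        time.sleep(2 ** attempt)
    return ok, failed
```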
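Finally, the posting asks for dimensional data models to support the warehouse. A minimal star-schema illustration, shown in SQLite purely for self-containment (the actual warehouse is Snowflake, and all table and column names are hypothetical): one fact table keyed to two dimension tables.

```python
# Minimal star schema: a balance fact joined to date and loan
# dimensions. SQLite is used only so the sketch runs anywhere.
import sqlite3

DDL = """
CREATE TABLE dim_date (
    date_key     INTEGER PRIMARY KEY,   -- e.g. 20250630
    calendar_day TEXT NOT NULL
);
CREATE TABLE dim_loan (
    loan_key     INTEGER PRIMARY KEY,
    loan_id      TEXT NOT NULL,         -- natural key from source system
    product_type TEXT NOT NULL
);
CREATE TABLE fact_loan_balance (
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    loan_key     INTEGER NOT NULL REFERENCES dim_loan(loan_key),
    balance      NUMERIC NOT NULL,
    PRIMARY KEY (date_key, loan_key)    -- one balance per loan per day
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
print("star schema created")
```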
