
Job Information

Prudential Insurance Company of America, Senior Data Engineer in Newark, New Jersey

Job Classification:
Technology - Engineering & Cloud
Are you interested in building capabilities that enable the organization with innovation, speed, agility, scalability and efficiency? The Global Technology team takes great pride in our culture, where digital transformation is built into our DNA! When you join our organization at Prudential, you'll unlock an exciting and impactful career, all while growing your skills and advancing your profession at one of the world's leading financial services institutions.

Your Team & Role:
As a Senior Data Engineer, you will partner with talented architects, infrastructure engineers, machine learning engineers, data scientists, and data analysts to improve many different products and services. You will analyze, design, develop, test, and perform ongoing maintenance to build high-quality data pipelines that drive platform delivery. You will implement capabilities to solve sophisticated business problems and deploy innovative products, services, and experiences to delight our customers! In addition to applied experience, you will bring excellent problem-solving, communication, and teamwork skills, along with agile ways of working, strong business insight, an inclusive leadership attitude, and a continuous learning focus to all that you do.

Here is What You Can Expect on a Typical Day:
Build and optimize data pipelines, logic, and storage systems using the latest coding practices, industry standards, modern design patterns, and architectural principles; actively code and execute against the roadmap (see the pipeline sketch after this list)
Develop high quality, well documented and efficient code adhering to all applicable Prudential standards
Conduct complex data analysis and report on results, prepare data for prescriptive and predictive modeling, and combine raw information from different sources
Collaborate with data analysts, scientists, and architects on data projects to enhance data acquisition, transformation, and organization processes, as well as data reliability, efficiency, and quality
Write unit and integration tests and functional automation, research problems discovered by quality assurance or product support, and develop solutions to address them
Bring an applied understanding of relevant and emerging technologies, begin to identify opportunities to provide input to the team and coach others, and embed learning and innovation in the day-to-day
Work on complex problems in which analysis of situations or data requires an in-depth evaluation of various factors
Use programming languages including but not limited to Python, R, SQL, Java, Scala, PySpark/Apache Spark, and shell scripting
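
For illustration only, here is a minimal PySpark sketch of the kind of ingest-transform-load pipeline work this list describes. The bucket paths, column names, and application name are hypothetical assumptions, not details from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("policy_events_pipeline").getOrCreate()

# Ingest: read raw JSON events from a hypothetical landing zone.
raw = spark.read.json("s3://example-landing-zone/policy_events/")

# Transform: drop malformed rows, normalize the timestamp,
# and derive a partition date.
clean = (
    raw.where(F.col("event_id").isNotNull())
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
)

# Load: write partitioned Parquet for downstream analytics and modeling.
clean.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-analytics-zone/policy_events/"
)
```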

The Skills & Expertise You Bring:
Bachelor's degree in Computer Science or Engineering, or experience in related fields
Experience working with DevOps automation tools and practices; knowledge of the full software development life cycle (SDLC)
Ability to leverage diverse ideas, experiences, thoughts, and perspectives to the benefit of the organization
Knowledge of the business concepts, tools, and processes needed to make sound decisions in the context of the company's business
Ability to learn new skills and knowledge on an ongoing basis through self-initiative and by tackling challenges
Excellent problem-solving, communication, and collaboration skills; enjoy learning new skills!
Applied experience with several of the following:
o Programming Languages: Python, R, SQL, Java, Scala, PySpark/Apache Spark, shell scripting
o Data Ingestion, Integration & Transformation: Moving data of varying sources, formats, and volumes to analytics platforms through various tools; preparing data for further analysis; transforming, mapping, and wrangling raw data to generate insights
o Database Management Systems: Storing, organizing, managing, and delivering data using relational, NoSQL, and graph databases and data warehouse technologies, including AWS Redshift and Snowflake
o Database Tools: Data architecture to store, organize, and manage data; experience with SQL- and NoSQL-based databases for storage and processing of structured, semi-structured, and unstructured data
o Real-Time Analytics: Spark, Kinesis Data Streams
o Data Buffering: Kinesis, Kafka
o Workflow Orchestration: Airflow, AppFlow, Autosys, CloudWatch, Splunk (a minimal DAG sketch follows this list)
o Data Visualization: Tableau, Power BI, MS Excel
o Data Lakes & Warehousing: Building Data Models, Data Lakes and Data Warehousing
o Data Protection and Security: Knowledge of data protection, security principles, and services; data loss prevention, role-based access controls, data encryption, data access capture, and core security services
o Common Infrastructure as Code (IaC) Frameworks: Ansible, CloudFormation
o Cloud Computing: Knowledge of the fundamentals of AWS architectural principles and services; strong ability to write code and work with CloudFormation; knowledge of AWS core services
o Testing/Quality: Unit, interface, and end-user testing concepts and tooling, inclusive of non-functional requirements (performance, usability, reliability, security/vulnerability scanning, etc.), including how testing integrates into DevOps; accessibility awareness (a unit-test sketch also follows this list)
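
As a point of reference for the workflow-orchestration item above, here is a minimal Airflow DAG sketch (assuming Airflow 2.4+; the DAG id, schedule, and callable are hypothetical, not from the posting).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_transform():
    # Placeholder for a real transformation step (e.g., a Spark or Glue job).
    print("transforming data")


with DAG(
    dag_id="example_daily_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # the `schedule` argument requires Airflow 2.4+
    catchup=False,
):
    PythonOperator(task_id="transform", python_callable=run_transform)
```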
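
Likewise, a tiny pytest sketch of the unit-testing practice named in the Testing/Quality item; the masking function under test is a hypothetical example, not Prudential code.

```python
def mask_ssn(ssn: str) -> str:
    """Mask all but the last four digits of a (hypothetical) SSN string."""
    return "***-**-" + ssn[-4:]


def test_mask_ssn_keeps_last_four():
    # Run with `pytest`; asserts the transformation behaves as documented.
    assert mask_ssn("123-45-6789") == "***-**-6789"
```
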
Preferred qualifications:
o Serverless data pipeline development using AWS Glue, Lambda, and Step Functions (a minimal sketch follows)
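
For the preferred serverless qualification, a minimal sketch under stated assumptions: an AWS Lambda handler that starts a Glue job via boto3, as a Step Functions state machine might invoke it. The job name and argument key are hypothetical, not from the posting.

```python
import boto3

glue = boto3.client("glue")


def handler(event, context):
    # Start a hypothetical Glue ETL job, forwarding a run date supplied
    # in the Step Functions input (if any) as a job argument.
    response = glue.start_job_run(
        JobName="example-etl-job",  # hypothetical job name
        Arguments={"--run_date": event.get("run_date", "")},
    )
    return {"JobRunId": response["JobRunId"]}
```
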
What we offer you:
Market competitive base salaries, with a yearly bonus potential at every level
Medical, dental, vision, life insurance, disability insurance, Paid Time Off (PTO), and leave of absences, such as parental and military leave
Retirement plans:
401(k) plan with company match (up to 4%)
Company-funded pension plan
Wellness Programs to help you achieve your wellbeing goals, including up to $1,600 a year for reimbursement of items purchased to support personal wellbeing needs
Work/Life...

Equal Opportunity Employer - minorities/females/veterans/individuals with disabilities/sexual orientation/gender identity
