DE Jobs


Job Information

Banco Popular Puerto Rico Senior Data Engineer | AWS in Charlotte, North Carolina

Date: Aug 31, 2024

Location:

Charlotte, NC, US, 28210

Company: Popular

Workplace Type: Remote

Senior Data Engineer | AWS

At Popular, we offer a wide variety of services and financial solutions to serve our communities in Puerto Rico, the United States, and the Virgin Islands. As employees, we are dedicated to making our customers' dreams come true by offering financial solutions at each stage of their lives. Our extensive trajectory demonstrates the resiliency and determination of our employees to innovate, reach for the right solutions, and strongly support the communities we serve; this is why we value their diverse skills, experiences, and backgrounds.

Are you ready for a rewarding career?

Over 8,000 people in Puerto Rico, the United States, and the Virgin Islands work at Popular.

Come and join our community!

The Opportunity

As an AWS Data Specialist, you'll hold a significant position within the Data Engineering & Analytical Enablement pillar, dedicating your advanced expertise to the detailed design, development, and implementation of analytical solutions on AWS. Your primary focus will be on data preprocessing and ensuring smooth data movement, both of which are essential for guiding informed decision-making and deriving actionable insights. Your senior position will also involve providing mentorship and leading initiatives to drive the analytical engineering agenda forward.

Your key responsibilities:

You will collaborate with multifaceted teams of specialists spread across various locations to offer a broad spectrum of data and analytics solutions. You will address complicated challenges and propel advancement within the Enterprise Data & Analytics function.

Specifically:

  • Craft and maintain a clear vision, strategy, and objectives for AWS services in the Data & Analytics domain, ensuring their effective application across various projects.

  • Deploy AWS services to build robust data management pipelines, data stores, and feature stores, both online and offline, to augment analytical capabilities.

  • Develop and implement data integration strategies, architectures, and processes to ensure the effective integration of data from multiple sources into the data lake and data warehouse.

  • Collaborate with teams across data architecture, data governance, data security, and business units, in addition to data analysts and other stakeholders to define data integration requirements and understand source data systems.

  • Guide the development of AWS-based data solutions, ensuring they align with business needs and are scalable, reliable, and secure.

  • Monitor and optimize costs related to ELT / ETL services and cloud management.

  • Work in tandem with cybersecurity architects to review AWS data lakes, guaranteeing that data architectures adhere to security standards and organizational policies.

  • Lead and mentor a dynamic team in designing, testing, and strategically deploying cutting-edge solutions for analytical and machine learning infrastructure in the cloud.

  • Set up cloud infrastructure components, including networking, security, storage, data migration, data processing, governance, analytics, log management, and monitoring services.

  • Participate in the creation and enforcement of data privacy policies and programs, ensuring that cybersecurity strategies are properly implemented across cloud data and analytics platforms and initiatives.

  • Coordinate with the corporate DevOps team in designing and building DataOps and ModelOps pipelines to automate and streamline data pipelines and the machine learning model development, inventory management, and deployment processes.

  • Keep abreast of the latest AWS technologies and data trends to perpetually enhance the organization's data and analytics capabilities. Mainly specialized in AWS Glue, EMR, Batch, Lambda, Athena, Kinesis, S3, Lake Formation, DynamoDB, EC2, QuickSight, Elasticsearch, RDS, DMS, MWAA, AppFlow, EKS, ECS, MSK, EventBridge, and Step Functions.

  • Promote and establish software engineering best practices within the analytics team to ensure the delivery of high-quality, dependable analytical solutions.

  • Develop and maintain detailed data-related documentation to ensure clarity and traceability of data processes and models.

  • Establish and enforce data quality standards, rules, and metrics to guarantee the accuracy and integrity of analytical results.

  • Champion software engineering best practices within the analytics team to foster the creation of high-quality, dependable analytical solutions.

  • Adopt version control and DataOps principles to ensure the reproducibility and scalability of analytical models and processes.

  • Perform thorough data testing to detect and correct errors, inconsistencies, and inaccuracies in analytical models and data processes.

  • Apply suitable encoding techniques to prepare data for analytical processing, enhancing the robustness and efficacy of analytical models.

  • Engage in close collaboration with various business units, data engineers, and stakeholders to comprehend business challenges, elicit requirements, and devise analytical solutions that align with organizational objectives.

  • Identify and incorporate new data sources and methodologies to enhance the precision and performance of analytical solutions.

  • Keep abreast of the latest data science and analytics trends and technologies, integrating cutting-edge approaches as applicable.

  • Validate the integrity, reliability, and robustness of analytical methods and their outputs through stringent validation processes.

  • Participate in the design, development, and implementation of analytical models, including predictive analytics, advanced clustering algorithms, and machine learning techniques, to process complex datasets and extract insights.

To qualify for the role, you must have:

  • Bachelor's degree in Computer Science, Information Systems, Engineering, Statistics, Mathematics, or a related field. A Master’s degree in a related field is a plus.

  • Minimum of 20 years of experience implementing large-scale Data & Analytics platforms in AWS, Azure, or Google Cloud, as well as in on-prem and hybrid environments.

  • Minimum of 10 years of experience leading and managing various functional teams within ED&A, such as data integration, data engineering, analytical engineering, BI/data visualization, Data Operations, or a similar role.

  • Experience leading engineering teams and delivering data capabilities following waterfall, iterative, scaled agile, scrum, and kanban methodologies.

  • In-depth knowledge of data integration methodologies such as change data capture, ETL & ELT processes, real-time data processing, micro-services, data lifecycle management, data lake, data warehouse, data vault, data mesh, data marketplace and data science concepts.

  • Hands-on experience with on-prem and cloud data platforms such as Snowflake, AWS Redshift, Azure Synapse Analytics, Databricks, AWS Aurora, Oracle Exadata, SQL Server, Hadoop, Spark, SAS, and R.

  • Proficiency in data integration tools and frameworks such as Informatica PC & IICS, IBM DataStage, DBT, Matillion, Microsoft SSIS, AWS Glue, AWS Batch, Azure Data Factory, AWS Data Pipeline, Qlik Replicate, Oracle GoldenGate, SharePlex, Apache NiFi, and Python-based frameworks.

  • Experience implementing tools and services in the data security and data governance domains, such as data modeling, data classification, data access control, data masking, data quality, metadata management, cataloging, auditing, balancing, reconciliation, and data privacy compliance requirements such as GDPR and CCPA.

  • Strong proficiency in Hive, SQL, Spark, Python, R, SAS, or other data manipulation and transformation languages.

  • Experience handling data streams, APIs, events, and container orchestration products such as OpenShift, EKS, and ECS.

  • Experience designing both online and offline feature stores that provide efficient data access for machine learning models.

  • Implementation experience with one or more cloud AI/ML platforms such as SageMaker, Dataiku, DataRobot, H2O.ai, Snowpark, ModelOp Center, and Domino Data Lab.

  • Experience handling high volumes of data in structured, semi-structured, and unstructured formats such as relational tables, flat files, XML, JSON, Parquet, Avro, mainframe copybooks, CSV, fixed-width, and hierarchical files.

  • Experience with DevOps and DataOps products such as Jenkins, Git, GitLab, Maven, Bitbucket, and Jira.

  • Experience in cloud transformation, implementing strategies such as rehost, re-platform, repurchase, refactor/re-architect, retire, and retain.

  • Experience with log integration and observability products such as Splunk, Datadog, Grafana, AppDynamics, and CloudWatch.

  • Hands-on experience designing and building data pipelines leveraging AWS services such as S3, S3 Glacier, EC2, ECS, EMR, SageMaker, IAM, RDS, DynamoDB, Hive, GraphDB, and DocumentDB.

  • Strong analytical, problem-solving, and critical thinking skills.

  • Ability to communicate complex data concepts effectively to a diverse array of stakeholders, both technical and non-technical.

  • A passion for continuous learning and staying abreast of industry innovations and trends.

What we look for:

We are seeking enthusiastic and proactive leaders who have a clear vision and an unwavering commitment to remaining at the forefront of data technology and science. Our ideal candidates aim to foster team spirit and collaboration and have a knack for adept management. It is essential that you display comprehensive technical proficiency and possess a rich understanding of the industry.

If you have a genuine drive for helping consumers achieve the full potential of their data while working towards your own development, this role is for you.

Important: The candidate must provide evidence of academic preparation or courses related to the job posting, if necessary.

If you have a disability and need assistance with the application process, please contact us at asesorialaboral@popular.com. This email inbox is monitored for such types of requests only. All information you provide will be kept confidential and will be used only to the extent required to provide reasonable accommodations. Any other correspondence will not receive a response.

As a leading financial institution in the communities we serve, we reaffirm our commitment to always offer essential financial services and solutions for our customers, including during emergency situations and/or natural disasters. Popular’s employees are considered essential workers, whose role is critical in the continuity of these important services even under such circumstances. By applying to this position, you acknowledge that Popular may require your services during and immediately after any such events.

If you are a California resident, please click here to learn more about your privacy rights.


Popular is an Equal Opportunity Employer

Learn more about us at www.popular.com and keep updated with our latest job postings at https://jobs.popular.com/usa/ .

Connect with us!

LinkedIn (http://www.linkedin.com/company/popularbank) | Facebook (https://www.facebook.com/popularbank) | Twitter (https://twitter.com/popularbank) | Instagram (https://www.instagram.com/popularbank/) | Blog (http://blog.popularbank.com/)

DirectEmployers