
Affinity Plus Federal Credit Union AWS Architect in Saint Paul, Minnesota

Description

Position Overview: An AWS Architect at Affinity Plus builds secure, resilient, and highly scalable solutions while keeping simplicity and operational effectiveness in mind. This role collaborates with teammates, product teams, and cross-functional project teams to lead the adoption and integration of AWS architecture services into the enterprise. You will be responsible for implementing securely architected solutions that are operationally reliable and performant and that deliver tangible, data-driven outcomes on strategic initiatives.

Duties and Responsibilities:

- Work closely with team members to lead, design, and drive enterprise solutions, advising on key decision points, trade-offs, best practices, and risk mitigation
- Use a defense-in-depth approach when designing and deploying performant systems that appropriately auto-scale and are highly available, fault-tolerant, self-monitoring, and serviceable
- Practice the six pillars of the AWS Well-Architected Framework: operational excellence, security, reliability, performance efficiency, cost optimization, and sustainability
- Create and maintain secure and performant network designs, assisting with troubleshooting as needed
- Build out new API integrations
- Assemble large, complex data sets into workstreams that meet functional and non-functional business requirements
- Assist and advise data engineers in the preparation and delivery of raw data into the data lake for prescriptive and predictive modeling
- Be impeccable in version control when performing commits, branching, and security tasks; security best practices for version control include data encryption, user authentication and authorization, access controls, audit trails, and threat detection
- Automate infrastructure provisioning when it makes sense and strive to ensure migrated workloads are cloud-native
- Partner with developers to continuously improve their ability to develop and deploy applications
- Build infrastructure for optimal extraction, loading, and transformation of data from a wide variety of data sources
- Work with developers to troubleshoot, maintain, and monitor scalable data pipelines
- Perform root cause analysis to answer specific business questions and identify opportunities for process improvement
- Collaborate with the Enterprise Digital Intelligence (EDI) team to improve data workflows that feed business intelligence tools, increasing data accessibility for staff and fostering data-driven decision-making across the organization
- Use observability and SIEM tools to monitor data and services, ensuring production data is secure, has integrity, and is available to key stakeholders and the business processes that depend on it
- Work in a hybrid workflow environment using agile methodologies as well as waterfall project/product management
- Employ change management best practices to ensure that services remain readily accessible to the business
- Maintain tools, processes, and associated documentation to manage the compute environment
- Implement reusable design templates and solutions to integrate, automate, and orchestrate cloud operational needs
- Be a good steward and practice effective cloud governance controls in cloud operations
- Manage and monitor Windows, Red Hat, and CentOS Linux operating systems using tools like AWS Systems Manager and Red Hat Satellite
- Readily communicate to leadership on topics including outages, updates on key infrastructure items, audit mitigation progress, and security vulnerabilities
- Other duties as assigned

Qualifications and Skills:

Required Qualifications and Skills
- 2+ years' experience with data lakes, e.g., Databricks, Snowflake, Amazon S3 and/or Lake Formation
- 3+ years of related experience in designing secure, scalable, and cost-effective big data architecture
- 5+ years' experience in a software development, data engineering, or data analytics field using Python, Scala, Spark, Java, or equivalent technologies
- Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience
- Mid-level knowledge of code versioning tools such as Git, Mercurial, or SVN
- Expert proficiency in AWS Lambda and in the Python, C++, Java, R, and SQL programming languages
- Expert proficiency in IaC tools, e.g., Terraform, Ansible, CloudFormation
- Proficiency in software engineering best practices employed in the software development lifecycle, including coding standards, code reviews, source control management, build processes, testing, and operations
- Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of unstructured data
- Strong understanding of financial industry technology standards and compliance requirements, and experience working with audit and regulatory bodies
- Mid-level experience with the AWS Cloud platform's core foundational native services and expertise in first-party Big Data & AI services such as AWS Glue, Amazon Athena, Amazon Kinesis, Amazon QuickSight, and Glue Crawlers
- Proficiency in working with all types of operating systems, especially Linux and Unix
- Proficient-level experience with architecture design, build, and optimization of big data collection, ingestion, storage, processing, and visualization
- Proficient in building, automating, and deploying data pipelines and workflows into end-user-facing applications
- Ability to remain up to date with industry standards and technological advancements that will enhance data quality and reliability to advance strategic initiatives
- Technical expertise with data models, data mining, and segmentation techniques
- Working knowledge of RESTful APIs, the OAuth2 authorization framework, and security best practices for API gateways
- Expert at diagnostics and problem resolution, providing third-level support
- Familiarity working with unstructured data sets (e.g., voice, image, log files, social media posts, email)
- Possess an organized, methodical approach and bring a continuous improvement mindset
- Demonstrated predisposition for action, willingness to partner and mentor, and an overall innate drive to provide an exceptional member and employee experience
- Highly creative and innovative technologist who thrives independently and collaborates well in a team environment
- Strong analytical and decision-making skills with a high degree of accuracy
- Strong verbal, written, and interpersonal communication skills
- Time management skills and the ability to prioritize workloads

Preferred Qualifications
- Experience in a financial institution
- Expert-level knowledge of AWS infrastructure configurations and service offerings
- Expert-level knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake
- Experience with master data management (MDM) using data governance solutions
- Advanced technical certifications preferred: AWS Certified Cloud Practitioner, Solutions Architect, Certified Developer, or SysOps Administrator; RHEL RHCSA/RHCE; AWS Certified Data Analytics; DASCA Big Data Engineering and Analytics

Workplace Environment:
- Working in a stationary position for 80% of the workday
- Using the telephone and video conferencing 10-20% of the day
- Moving, lifting, and/or carrying 30 pounds with or without accommodations
- Bending, twisting, kneeling, stooping, or crouching when appropriate, on occasion
- Repetitive movements, including but not limited to typing, mousing, phones, etc.
- May require travel for an onsite presence for employee meetings and events for collaboration, connection, project work, All-Employee Day,...

For full info follow application link.

Affinity Plus is an Affirmative Action/Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, age, protected veteran status, or status as an individual with a disability.
