DE Jobs

Job Information

SMBC Software Engineer - Data (Remote) in Scottsdale, Arizona

Join us on our mission to create a completely new, 100% digital bank that truly serves customers' best interests. We are a close-knit and fun-loving team of seasoned financial services professionals who came together for the challenge of building a bank from scratch - and we are committed to doing it all the right way (from technology infrastructure to modern marketing to customer experience).

The anticipated salary range for this role is between $75,000.00 and $150,000.00. The specific salary offered to an applicant will be based on their individual qualifications, experiences, and an analysis of the current compensation paid in their geography and the market for similar roles at the time of hire. The role may also be eligible for an annual discretionary incentive award. In addition to cash compensation, SMBC offers a competitive portfolio of benefits to its employees.

We work with the flexibility and speed of a start-up. But we also have significant stability and capital from being part of the SMBC Group (Sumitomo Mitsui Banking Corporation). SMBC is the second largest bank in Japan and the 12th largest bank in the world with operations in over forty countries. And SMBC is committed to disrupting the US marketplace with ground-breaking products.

It is the best of both worlds, and we are seeking proven marketing leaders to propel us towards a national launch. We have both the ambitious growth plans and the 'patient capital' necessary to execute a multi-year plan. Join us on the journey to deliver an exciting concept of evolved banking.

JOB SUMMARY:

Jenius Bank is looking for a hands-on Sr. Software Engineer - Data proficient in Java, Scala, and Python. You'll be part of the team responsible for building the Data and Analytics Platform for the Digital Bank Unit. As a Sr. Software Engineer - Data, you will have the opportunity to run proofs of concept on new cloud technologies, build a highly scalable data platform that supports critical business functions, and create REST APIs that expose data services to internal and external consumers.

PRINCIPAL DUTIES AND RESPONSIBILITIES:

  • Solid experience with, and understanding of, the considerations involved in building and operating large-scale data warehouses, data lakes, and analytics platforms on GCP.

  • Monitor the Data Lake continuously and ensure that the appropriate support teams are engaged at the right times.

  • Design, build, and test scalable data ingestion pipelines, and automate the end-to-end ETL process for each dataset being ingested.

  • Determine the best way to extract application telemetry data, structure it, and send it to the proper tool for reporting (Kafka, Splunk).

  • Create reports to monitor usage data for billing and SLA tracking.

  • Work with business and cross-functional teams to gather and document requirements to meet business needs.

  • Provide support as required to ensure the availability and performance of ETL/ELT jobs.

  • Provide technical assistance and cross training to business and internal team members.

  • Collaborate with business partners for continuous improvement opportunities.

POSITION SPECIFICATIONS:

  • Bachelor's degree in Computer Science, Computer Engineering, or Information Systems Technology

  • 6+ years of experience in Data Engineering with an emphasis on Data Warehousing and Data Analytics.

  • 4+ years of experience with one of the leading public clouds.

  • 4+ years of experience designing and building scalable data pipelines covering extraction, transformation, and loading.

  • 4+ years of experience with Python and Scala, with working knowledge of notebooks.

  • 1+ years of hands-on experience on GCP data implementation projects (Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, Airflow, etc.).

  • At least 2 years of experience in data governance and metadata management.

  • Ability to work independently, solve problems, and keep stakeholders updated.

  • Analyze, design, develop, and deploy solutions per business requirements.

  • Strong understanding of relational and dimensional data modeling.

  • Experience in DevOps and CI/CD related technologies.

  • Excellent written and verbal communication skills, including experience with technical documentation and the ability to communicate with senior business managers and executives.

EOE STATEMENT

We are an equal employment opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, national origin, disability status, protected veteran status or any other characteristic protected by law.

CCPA DISCLOSURE

Personal Information Collection Notice: This notice contains information under the California Consumer Privacy Act (CCPA) about the categories of personal information (PI) of California residents that Manufacturers Bank collects and the business or commercial purpose(s) for which the PI may be used. We do not sell PI. More information about our collection and use of PI may be found in our CCPA Privacy Policy at https://www.manufacturersbank.com/CCPA-Privacy. Persons with disabilities may contact our Customer Contact Center toll-free at (877) 560-9812 to request the information in this Notice in an alternative format.