Cloud Data Engineer - Healthcare
Portland, ME, USA
Arkatechture
Arkatechture is a data & analytics consulting firm in Portland, ME that provides the expertise, tools, and training needed for true business intelligence.
Department: Professional Services
Employment Type: Full Time
Location: Portland, ME, USA
Description
Why Work Here?
At Arkatechture, we have a simple shared mission: to build a sustainable organization upon three pillars: do something meaningful, with a great team, earning what you deserve.
We started in 2012 with a passion for data, business, and getting things done. We are a team of data lovers and technical experts who use our skills to help businesses big and small harness, utilize, and optimize their data. As New England’s Data Resource, we are a small company constantly evolving to keep up with changing landscapes in the data world.
We are proud of the community and culture that we’ve created at Arkatechture, and we have no intention of slowing down. We offer a competitive benefits package that includes:
- A flexible work-from-home policy (work 100% remotely!)
- Open-concept offices in Portland, ME with an easy-going dress code, and fresh pots and pops all day (that’s coffee and popcorn!)
- Training & certificate reimbursement
- Medical, disability, and life insurance, plus optional dental and vision coverage
- 401K Retirement planning with company matching
- Generous paid time off and eleven paid holidays
- Employee recognition through milestone awards, including annual PTO increases and a 4-day work week at 3 years of service!
The Position
As a Cloud Data Engineer specializing in Healthcare, you will be a key member of our engineering team, responsible for architecting, developing, and optimizing publish/subscribe messaging systems, with a particular focus on Amazon Web Services (AWS). You will collaborate with cross-functional teams to design and implement robust publish/subscribe solutions built on a bi-directional data exchange architecture, using AWS messaging services, Apache Kafka, and other FedRAMP-authorized Cloud Service Offerings alongside industry-standard protocols, to enable efficient, real-time data exchange through a hub-and-spoke framework within our software ecosystem.
How to Apply
Please send a cover letter and resume with your application. You must have 3+ years of experience working for a Medicaid agency, and you must submit all requested documents to be considered for the position.
Key Responsibilities
- Designing and implementing highly secure, FedRAMP-authorized Cloud Service Offerings (CSOs) such as AWS GovCloud, Amazon MSK, Okta, and Snowflake
- Designing and building data pipelines using AWS services
- Developing bidirectional data exchange systems
- SQL/Python development, especially for serverless computing and event-based triggers
- Developing and testing code
- Working with Senior Data Engineers on the team toward full end-to-end delivery of projects and solutions
- Communicating with both technical and non-technical collaborators
- Following engineering best practices
- Reporting status to the team lead on a regular cadence
- Estimating work and collaborating with the Project Manager on task allocation
- Additional responsibilities as assigned
Skills, Knowledge and Expertise
Minimum Qualifications
- 3+ years of experience working for a Medicaid agency, either as an employee or as a consultant working with Medicaid systems and data
- 3+ years of experience in a similar individual contributor role
- Bachelor's degree in a related field or comparable work experience
- Excellent SQL skills and a solid understanding of conformed data model implementation
- Experience working with Snowflake plus at least two additional databases, such as SQL Server, Oracle, Aurora, PostgreSQL, Redshift, or MySQL
- Experience developing in Python; JavaScript is a nice-to-have
- Experience working on Data Management projects for Data Lakes/Data Warehousing
- Experience working with APIs (specifically REST APIs), SDKs, and CLI tools as part of ETL/ELT provisioning
- Experience working with multiple file formats such as JSON, XML, CSV, and flat files
- Experience extracting data from databases using ODBC/JDBC
- Strong understanding of Microservices architecture
- A strong understanding of the Agile software development life cycle and methodology
Preferred Experience
- One or more of the following certifications:
- AWS Solutions Architect Associate/Professional
- AWS Developer Associate/Professional
- Snowflake SnowPro Core/Advanced
- Domain expertise in Healthcare and one or more of the following verticals: Financial Services, Retail, Telco, Digital Marketing, Supply Chain, or Transportation