Software Engineer, Data Infrastructure
New York City; San Francisco, California; Chicago, Illinois
Applications have closed
Komodo Health
Access the industry’s most precise view of the U.S. healthcare system through Komodo’s Healthcare Map, which combines the most comprehensive view of patient encounters with innovative algorithms and decades of clinical expertise.
We Breathe Life Into Data
At Komodo Health, our mission is to reduce the global burden of disease. And we believe that smarter use of data is essential to this mission. That’s why we built the Healthcare Map — the industry’s largest, most complete, and most precise view of the U.S. healthcare system — by combining de-identified, real-world patient data with innovative algorithms and decades of clinical experience. The Healthcare Map serves as our foundation for a powerful suite of software applications, helping us answer healthcare’s most complex questions for our partners. Across the healthcare ecosystem, we’re helping our clients unlock critical insights to track detailed patient behaviors and treatment patterns, identify gaps in care, address unmet patient needs, and reduce the global burden of disease.
As we pursue these goals, it remains essential to us that we stay grounded in our values: be awesome, seek growth, deliver “wow,” and enjoy the ride. At Komodo, you will be joining a team of ambitious, supportive Dragons with diverse backgrounds but a shared passion to deliver on our mission to reduce the burden of disease — and enjoy the journey along the way.
The Opportunity at Komodo Health:
You will be joining the Infrastructure team. Our team is lean, competent, and flexible. More importantly, we support and teach each other. We own Komodo’s software infrastructure, which includes all applications, tools, and frameworks used to interact with public clouds like AWS. We build, operate, and constantly improve strategic tools that help with container orchestration, microservice management, ETL, CI/CD, monitoring, metrics, logging, and so on. We also manage third-party services and their procurement. With the rapid growth and deep transformation Komodo is undergoing, the Infrastructure team needs to re-architect and significantly expand our cloud and data infrastructure. Join us, and you will quickly be driving large projects and contributing to Komodo’s success.
We are a Python and AWS shop, and most of us are Python and AWS experts. Our tech stack includes, but is not limited to: Kubernetes, Helm, Docker, Spark, Airflow, JupyterHub, Jenkins, Elasticsearch, FastAPI, Terraform, and Packer, as well as third-party services like Snowflake, Ascend, Databricks, and Great Expectations. We also make heavy use of AWS core Compute, Storage, and Networking services such as EC2, EBS, S3, Load Balancing, Auto Scaling groups, VPCs, and so on.
Looking back on your first 12 months at Komodo Health, you will:
- Have led at least one strategic project (more for senior positions) end-to-end and be the go-to person in this area at Komodo
- Have participated in multiple strategic projects using technologies in our tech stack (mentioned above). You will be among the few that people go to for help on these technologies
- Have helped other teams resolve major problems and earned their trust
- Have significantly improved your problem solving, software design, product management, and debugging skills
- Be much better at making tradeoffs and translating resource and time constraints into design requirements
- Be comfortable with communicating with all levels of the organization
- Be comfortable working with various open-source projects, and perhaps have merged PRs
- Be a subject matter expert in technologies that are in high demand
Some Of The Projects We Are Currently Working On:
- Improving our Kubernetes architecture to enable more efficient allocation of nodes
- Improving the performance and ease of use of our Spark and Airflow platforms
- Hardening the security of our cloud infrastructure
- Building internal software systems and self-service tools to enable the creation of secure and compliant AWS resources
- Building our centralized logging and metrics-collection infrastructure
- Open-sourcing some of our tools
- Building CI/CD for applications and data pipelines
We are waiting for you to define more projects!
What you bring to Komodo:
- Passion! We hope you are passionate about our mission and technology
- Ownership! We hope you own your work, hold yourself accountable, and push it across the finish line. We hope you treat yourself as a cofounder and do not hesitate to share any idea that helps Komodo
- Expertise! We do not need you to know everything, but we hope you have deep knowledge in at least one area and can start contributing quickly. And we would love to learn from you in your area(s) of expertise
- Experience with AWS’s core services and their relationships. Ability to create solutions on AWS based on users’ high-level descriptions. Ability to learn new AWS technologies and use them as needed.
- Expertise in any general programming language(s) like Python, Java, C++, Go, Scala, and so on. Ability to solve problems and automate common tasks with your code. Ability to pick up a new programming language quickly and read large code bases written in a new language.
- Experience with cloud and big-data tools and frameworks like Spark, Airflow, Snowflake, and so on.
- Knowledge of Linux fundamentals. Comfortable working on the Linux command line (e.g., Bash), with experience using basic Linux commands and tools.
- Experience with using and managing Kubernetes clusters. Experience with running services on Kubernetes. Basic knowledge of its ecosystem, e.g., Helm and minikube.
- Experience with CI/CD pipelines. Knowledge of basic concepts and their applications. Ability to create Jenkins pipelines and manage Jenkins instances. Experience with related CI/CD tools such as Harness, Travis CI, CircleCI, and GitHub Actions.
- Experience with infrastructure-as-code tools such as Terraform, CloudFormation, and Packer.
- Ability to learn new technical concepts quickly
- Ability to operate in a fast-paced and dynamic environment
- Strong collaboration and communication skills
Where You’ll Work
Komodo Health has a hybrid work model; we recognize the power of choice and importance of flexibility for the well-being of both our company and our individual Dragons. Roles may be completely remote based anywhere in the country listed, remote but based in a specific region, or local (commuting distance) to one of our hubs in San Francisco, New York City, or Chicago with remote work options.
What We Offer
On top of our commitment to providing competitive, fair pay for all roles at Komodo Health, we’re proud to offer robust and inclusive benefits to all Dragons at Komodo Health. We offer global time off programs, extensive internal and external career development and learning opportunities, multiple affinity groups celebrating our team’s diversity, and an annual wellness and productivity stipend to support you in being your healthiest, best self.
Equal Opportunity Statement
Komodo Health provides equal employment opportunities to all applicants and employees. We prohibit discrimination and harassment of any type with regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.