Python Data Analyst
Remote - Thiruvananthapuram, Kerala, India
Treez
The leading enterprise commerce platform for cannabis retailers: point of sale, cashless payments, analytics, compliance, and integrations.
Job Description:
Python Data Analyst
Location: Trivandrum, India (remote)
The Treez Product Development team is looking for a high-energy, team-oriented software engineering enthusiast to drive our next-generation business management platform and applications. If you are looking for an exciting, fast-paced, full-time role and are a high-caliber team player who can lead and shape the future of our industry, this position is for you!
The Treez enterprise-grade, business management platform serves the vertical cannabis market from producers to brands to retailers with our Point of Sale, Integrated Payments, Branding, Grow and Data Insights software solutions. Treez is helping businesses deliver personalized and efficient experiences to their customers. The Treez enterprise business management solution is delivered through a Software-as-a-Service (SaaS) model on a modern, scalable cloud infrastructure.
To learn more, visit Treez at: https://treez.io/
Responsibilities
You will play a pivotal role in adding intelligence to the new and innovative capabilities your team is focused on. Your work will be a central function in our investment plans. You will work alongside our global team of product owners, software developers, and quality assurance teams at Treez.
We are seeking a highly motivated individual looking to grow their career in a fast-paced environment. If you have a passion for developing high-quality, scalable cloud software solutions, then please keep reading!
- Impacts software design, code quality, and maintainability.
- Creates application software through manual and automated processes.
- Drives operational excellence.
- Applies DevOps skills to automate development and operational monitoring.
- Innovates to improve the speed of delivery.
- Independently invents, designs and builds stable and performant software.
- Leverages knowledge of a broad range of design approaches and determines when to use them (and when not to).
- Considers code’s legacy, limiting the use of short-term workarounds.
- Is judicious about adding dependencies.
- Efficient with resources: hardware, database, memory/CPU, etc.
- Participates in code reviews, design discussions, team planning, and metrics reviews.
- Develops software enhancements that improve team software and processes.
- Documents and mentors fellow team members in software and application development best practices and methods.
- Works with peers and business partners to document business processes and information technology, analyze requirements, and prepare functional and technical specifications.
- Provides support of existing and new applications when escalated.
- Assists in troubleshooting and resolving technical issues and processes.
- Develops reporting systems that provide accessible information for decision-making.
- Uses warehouse/data lake sources to generate reports that support business decision-making and business monitoring.
- Maintains data analytics platforms.
- Evaluates and improves existing BI systems.
- Conducts troubleshooting on BI models and reports.
- Generates and delivers quality reports to customers and performs quality assurance checks on reports.
Skills & Abilities
- Experience working with Amazon Web Services (AWS), including API Gateway, Lambda, and CloudFormation, to implement product features that scale.
- Proficient in a broad range of design approaches and knows when it is appropriate to use them (and when it is not).
- Can demonstrate experience solving difficult technical problems in medium-to-large software projects.
- Has demonstrated strong code-writing ability across several platforms, environments, and/or languages.
- Demonstrated knowledge of relational (SQL) and NoSQL databases.
- PySpark experience preferred.
- Experience as a BI developer.
- Experience in data warehouse design.
- Working knowledge of BI technologies.
- Strong problem-solving and analytical skills.
- Strong attention to detail.
- Good communication skills.
Qualifications
- 2+ years of software experience using enterprise-class languages and tools
- 2+ years of experience with Cloud technologies, especially AWS
- 2+ years of experience with multi-tier software architecture
- Degree in computer software or equivalent work experience, with a Computer Science or Engineering background
- Knowledge and experience in enterprise software including strongly and loosely typed software development languages, API development and use, enterprise software, cloud architectures, and web services architectures
- Working knowledge of enterprise integration architectures and patterns
- Working knowledge of cloud computing and Software as a Service (SaaS) architecture
Environment and Physical Demands
- Able to transport a portable computer (approximately 20 pounds).
- Work from home 100%.
- May require work at a Treez US location or partner locations, which will require travel from India to the United States.
- Travel within the United States and internationally as needed.
Treez is an equal opportunity employer. We thrive on diversity and are committed to creating an inclusive environment for all employees.