Job Description
Acko is India's first and only all-digital Insurtech product company. Through innovative digital products, customised pricing, and the use of data and tech, we are changing how insurance works and how it is perceived by users in India.
Although we are solving for the Indian market, we are part of a global wave of Insurtech startups finding success through technology and business-model disruption: ZhongAn in China ($11 Bn valuation), and Oscar ($3 Bn valuation), Lemonade, and Metromile in the US are some of the others rejigging this space.
We are a well-funded Series D company backed by a slate of marquee investors including Binny Bansal, Amazon, Ascent Capital, Accel, SAIF, and Catamaran. While FY21 will be only our third year of operations, we are expecting strong financial growth on the back of a steep growth trajectory over the past two years. We clocked roughly $20M in premiums (revenue) in our first year of operations and ended FY20 with over $50M in premiums, a growth rate of 150%. Through partnerships with large internet players such as Amazon, Ola, RedBus, Oyo, Lendingkart, ZestMoney, and the GoMMT group, our micro-insurance products have reached ~50M unique users.
Our commitment to building a diverse team, innovative products, and a collaborative culture has earned us many accolades and awards. We're 'Great Place to Work' certified and have consistently featured on LinkedIn's list of top startups. A team of 450 and counting, we are growing at an unstoppable pace and would love for you to be part of this incredible journey.
Role
As a Data Engineer at Acko, you will work on collecting, storing, processing, and analysing huge datasets. The data may come from heterogeneous sources and needs to be collected either in batch or in real time. The primary focus will be on choosing optimal, highly scalable solutions for these purposes and then implementing, maintaining, and monitoring them. You will be part of a team building the next-generation data warehouse platform, introducing new technologies and practices into existing data pipelines and extending or migrating to new architecture as needed. You will be responsible for supporting the rapidly growing, dynamic business demand for data and making it available for business decisions. This data is consumed mostly by the Data Analytics and Data Science teams and has an immediate influence on day-to-day decision making at Acko.
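By way of illustration, the sketch below shows what a minimal real-time ingestion consumer might look like, assuming the kafka-python client; the topic name, broker address, and consumer group are hypothetical placeholders rather than Acko's actual setup.

```python
# Minimal real-time ingestion sketch, assuming the kafka-python client.
# Topic, broker address, and group id are hypothetical placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "policy-events",                          # hypothetical topic
    bootstrap_servers=["localhost:9092"],     # hypothetical broker
    group_id="warehouse-ingest",              # hypothetical consumer group
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",             # replay from the start on first run
)

# Each message would normally be validated, transformed, and written to the
# warehouse; printing stands in for that sink here.
for message in consumer:
    print(message.value)
```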
Responsibilities
- Design, develop, and maintain scalable data pipelines with a focus on writing clean, fault-tolerant code using Python, Airflow, Kafka, Spark, Apache Beam, Dataflow, or similar Big Data solutions on cloud platforms (a minimal sketch follows this list).
- Design and manage batch and real-time data ingestion from multiple data sources and messaging systems.
- Collaborate with analytics/data science teams on data mart optimizations, query tuning, database design, and data modelling.
- Monitor performance and advise on any necessary infrastructure changes.
- Model data and metadata to support ad-hoc and pre-built reporting.
- Work with product and engineering teams on data-driven products and drive/implement the data computation pipelines behind them.
- Own the design, development, and maintenance of ongoing metrics, reports, and dashboards on the data platform to drive key business decisions.
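As referenced above, here is a minimal sketch of the kind of batch pipeline this role involves, written as an Airflow DAG (assuming Airflow 2.x); the DAG id, task names, and stubbed callables are hypothetical, not an actual Acko pipeline.

```python
# Minimal batch-pipeline sketch, assuming Airflow 2.x. The DAG id, task
# names, and stubbed extract/load logic are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_events(ds, **kwargs):
    # `ds` is Airflow's logical date string, e.g. "2021-01-01".
    print(f"extracting raw events for {ds}")


def load_to_warehouse(**kwargs):
    print("loading transformed batch into the warehouse")


with DAG(
    dag_id="daily_events_batch",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",       # one batch run per day
    catchup=False,
    default_args={
        "retries": 2,                 # fault tolerance via retries
        "retry_delay": timedelta(minutes=5),
    },
) as dag:
    extract = PythonOperator(task_id="extract_events",
                             python_callable=extract_events)
    load = PythonOperator(task_id="load_to_warehouse",
                          python_callable=load_to_warehouse)

    extract >> load   # load runs only after a successful extract
```

In a production pipeline each callable would be an idempotent step that can safely rerun, so fault tolerance comes from retries plus idempotency rather than from the stubs shown here.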
Requirements
- Bachelor's/Master's degree in Computer Science, Engineering, Statistics, Mathematics, or a related field.
- Proficient understanding of distributed computing principles.
- 2+ years of industry experience in the Big Data stack.
- Demonstrated ability in data modelling, ETL development, and data warehousing.
- Knowledge of and exposure to Big Data solutions on cloud platforms like GCP, AWS, or Azure.
- Experience with cloud MPP data warehouse solutions like Google BigQuery, Redshift, or Azure SQL DW.
- Experience with various messaging systems such as Kafka, RabbitMQ, Kinesis, etc.
- Knowledge of various ETL techniques and frameworks.
- Experience with Python, Airflow, Spark, and SQL/NoSQL databases.
- Strong SQL and data modelling skills, with solid knowledge of industry standards such as dimensional modelling and star schemas (see the sketch after this list).
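As a concrete example of dimensional modelling, the sketch below builds a toy star schema; the table and column names are hypothetical, and sqlite3 (from the standard library) stands in for an MPP warehouse such as BigQuery or Redshift so the snippet runs anywhere.

```python
# Toy star schema: one fact table at the grain of a single policy sale,
# joined to conformed dimensions. Names are hypothetical; sqlite3 stands
# in for an MPP warehouse so the sketch is runnable as-is.
import sqlite3

DDL = """
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20200401
    full_date TEXT,
    month     INTEGER,
    year      INTEGER
);
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT,
    category     TEXT
);
CREATE TABLE fact_policy_sale (
    sale_id     INTEGER PRIMARY KEY,
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    premium_usd REAL                 -- additive measure
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)

# A typical analytical query over the star: premiums by month and category.
query = """
SELECT d.year, d.month, p.category, SUM(f.premium_usd) AS total_premium
FROM fact_policy_sale f
JOIN dim_date d    ON f.date_key = d.date_key
JOIN dim_product p ON f.product_key = p.product_key
GROUP BY d.year, d.month, p.category;
"""
print(conn.execute(query).fetchall())   # [] until the tables are populated
```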
Preferred Qualifications
- Experience with big data tools such as Cloud Dataflow, BigQuery, etc.
- Experience with and exposure to GCP is a plus.
- Knowledge of PostgreSQL databases is good to have.
- Familiarity with reporting tools like Tableau or other BI packages.
Acko is an equal opportunity employer. We welcome and encourage diversity in the workplace regardless of race, gender, religion, age, sexual orientation, gender identity, disability, or veteran status.

Open Positions: 1