No. of open positions: 1
Velotio is looking for a Senior Data Engineer to help us architect, design, and develop ETL data pipelines and data lakes for several customers across the IoT, consumer, and enterprise spaces.
- Design and build data infrastructure with efficiency, reliability and consistency to meet rapidly growing data needs
- Design data pipelines and data integrations to collect, clean, and store large datasets (streaming and batch)
- Help establish and maintain a high level of operational excellence in data engineering
- Evaluate, integrate and build tools to accelerate Data Engineering, Data Science, Business Intelligence, Reporting and Analytics as needed
Experience: 3-6 years
- 2+ years data engineering or equivalent knowledge and ability
- 4+ years software engineering or equivalent knowledge and ability
- Experience designing and maintaining at least one database type (relational, object, columnar, in-memory, key-value, tabular, triple-store, tuple-store, etc.)
- Experience with data warehouse modernization, building data-marts, star/snowflake schema designs, infrastructure components, ETL/ELT pipelines and BI/reporting/analytic tools
- Extensive hands-on experience with batch and stream data processing (e.g., Airflow, Spark, Kinesis, Kafka)
- Experience with OLAP data warehouses such as Redshift and Snowflake
- Advanced SQL skills and strong proficiency in at least two of the following programming languages: Python, Scala, and Java
- Familiarity with pandas, SciPy, scikit-learn, seaborn, and Spark MLlib
- Experience putting machine learning into production at scale is a bonus
- Excellent cross-functional collaboration and communication skills
We have a fairly autonomous work culture with very little hierarchy, which encourages people to take ownership and accountability. We want to hire curious folks who will fit into this culture. If this sounds like what you are looking for, we should connect quickly. We look forward to hearing from you.