We are an early stage start-up, building new fintech products for small businesses. Founders are IIT-IIM alumni, with prior experience across management consulting, venture capital and fintech startups. We are driven by the vision to empower small business owners with technology and dramatically improve their access to financial services. To start with, we are building a simple, yet powerful solution to address a deep pain point for these owners: cash flow management. Over time, we will also add digital banking and 1-click financing to our suite of offerings.
We have developed an MVP which is being tested in the market. We have closed our seed funding from marquee global investors and are now actively building a world class tech team. We are a young, passionate team with a strong grip on this space and are looking to on-board enthusiastic, entrepreneurial individuals to partner with us in this exciting journey. We offer a high degree of autonomy, a collaborative fast-paced work environment and most importantly, a chance to create unparalleled impact using technology.
Reach out if you want to get in on the ground floor of something which can turbocharge SME banking in India!
Technology stack at Velocity comprises a wide variety of cutting-edge technologies like NodeJS, Ruby on Rails, Reactive Programming, Kubernetes, AWS, Python, ReactJS, Redux (Saga), Redis, Lambda, etc.
Responsible for building data and analytics engineering pipelines with standard ELT patterns, implementing data compaction pipelines, data modelling, and overseeing overall data quality
Work with the Office of the CTO as an active member of our architecture guild
Write pipelines to consume data from multiple sources
Write a data transformation layer using DBT to transform millions of records into the data warehouse
Implement data warehouse entities using common, reusable data model designs with automation and data quality capabilities
Identify downstream implications of data loads/migration (e.g., data quality, regulatory)
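To make the ELT pattern in the responsibilities above concrete, here is a minimal, illustrative sketch (not Velocity's actual codebase): raw records are loaded into the warehouse first, then transformed inside it into a modelled table, which is the role DBT plays in a production stack. SQLite stands in for the warehouse, and the table and column names are hypothetical.

```python
import sqlite3

def load_raw(conn, records):
    """E+L steps: land source records as-is in a raw table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS raw_payments "
        "(id INTEGER, amount_cents INTEGER, status TEXT)"
    )
    conn.executemany("INSERT INTO raw_payments VALUES (?, ?, ?)", records)

def transform(conn):
    """T step: build a cleaned, modelled table inside the warehouse,
    analogous to a DBT model defined in SQL."""
    conn.execute("DROP TABLE IF EXISTS fct_payments")
    conn.execute("""
        CREATE TABLE fct_payments AS
        SELECT id, amount_cents / 100.0 AS amount, status
        FROM raw_payments
        WHERE status = 'settled'
    """)

conn = sqlite3.connect(":memory:")
load_raw(conn, [(1, 1250, "settled"), (2, 900, "failed"), (3, 400, "settled")])
transform(conn)
rows = conn.execute("SELECT id, amount FROM fct_payments ORDER BY id").fetchall()
print(rows)  # [(1, 12.5), (3, 4.0)]
```

The key design point of ELT (as opposed to ETL) is that transformation happens after loading, inside the warehouse, so raw data stays available for re-modelling and data-quality checks downstream.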
What To Bring
2+ years of software development experience; startup experience is a plus.
Prior experience with Airflow and DBT is preferred
2+ years of experience working with a backend programming language.
Strong first-hand experience with data pipelines and relational databases such as Oracle, Postgres, SQL Server or MySQL
Experience with DevOps tools (GitHub, Travis CI, and JIRA) and methodologies (Lean, Agile, Scrum, Test Driven Development)
Experience formulating ideas, building proofs of concept (POCs), and converting them into production-ready projects
Experience building and deploying applications on on-premise infrastructure and on cloud platforms such as AWS or Google Cloud
A basic understanding of Kubernetes and Docker is a must.
Experience in data processing (ETL, ELT) and/or cloud-based platforms
Working proficiency and communication skills in verbal and written English.