Key Responsibilities

  • Design, implement, and deploy enterprise data solutions using cutting-edge cloud-based technologies
  • Follow Agile methodologies to release iterative feature sets at a rapid cadence
  • Research and introduce new solutions and technologies to the project and its stakeholders; provide technical guidance and suggest improvements to the development process
  • Work independently, without close guidance, while maintaining a broad overview of the project
  • Coordinate with other teams as part of a larger data-sharing system
  • Employ software development best practices such as automated testing, peer code reviews, continuous integration, and continuous delivery
  • Translate business requirements into technical specifications
  • Communicate clearly and document processes
  • Perform quality assurance and testing of your work
  • Contribute to a collaborative, positive, stimulating, and enjoyable environment for your development team

Qualifications

  • Bachelor’s degree in Computer Science, Engineering, or Information Systems, or equivalent experience
  • Must have 5+ years of work experience with programming languages and object-oriented design (Python preferred)
  • Must have strong database fundamentals, including SQL, relational and non-relational data models and schema design, and an understanding of database performance implications
  • Must have an understanding of cloud-based technologies (AWS, GCP, or Azure; AWS preferred)
  • Must have experience leveraging automated tests for code validation and test-driven development
  • Must have experience building and deploying products using continuous integration principles
  • Must have working knowledge of software engineering and development methodologies, techniques, and tools, including issue tracking (e.g., JIRA), code repositories (e.g., Git, Bitbucket), and the software development lifecycle

Required Skills

  • Experience building workflow orchestration, logging, error handling, and automated testing using Python and the Pytest framework
  • Understanding of “Big Data” ETL methodologies and experience managing large-scale data sets
  • Strong understanding of data structures, algorithms, and distributed systems

Desired Skills

  • Experience with Snowflake data warehouse including scheduled tasks, table streams and JavaScript stored procedures
  • Experience with AWS services (such as S3, EC2, RDS, EMR, Lambda, or SNS/SQS)
  • Experience with data processing workflow systems (Apache NiFi, Talend, or Airflow)
  • Experience creating reports, dashboards, and visualizations (Tableau preferred)

Location

  • Work from office (Hinjewadi, Pune)

To join us, send your resume to hr@fluid.live.