Join phData, a dynamic and innovative leader in the modern data stack. We partner with major cloud data platforms like Snowflake, AWS, Azure, GCP, Fivetran, Pinecone, Glean, and dbt to deliver cutting-edge services and solutions. We're committed to helping global enterprises overcome their toughest data challenges.
phData is a remote-first global company with employees based in the United States, Latin America and India. We celebrate the culture of each of our team members and foster a community of technological curiosity, ownership and trust. Even though we're growing extremely fast, we maintain a casual, exciting work environment. We hire top performers and allow you the autonomy to deliver results.
- 5x Snowflake Partner of the Year (2020, 2021, 2022, 2023, 2024)
- Fivetran, dbt, Alation, and Matillion Partner of the Year
- #1 Partner in Snowflake Advanced Certifications
- 600+ Expert Cloud Certifications (Sigma, AWS, Azure, Dataiku, etc.)
- Recognized as an award-winning workplace in the US, India, and LATAM
We are seeking qualified DevOps engineers to join our growing Cloud Data Operations and services team in Bangalore, India, as we continue our rapid growth with an expansion of our Indian subsidiary, phData Solutions Private Limited. This expansion comes at the right time with increasing customer demand for data and platform solutions.
In addition to the phenomenal growth and learning opportunities, we offer a competitive compensation plan, including base salary, annual bonus, training, certifications, and equity.
As a DevOps Engineer on our Consulting Team, you will be responsible for the technical delivery of projects related to Snowflake, cloud platforms (AWS/Azure), and services hosted in the cloud.
Responsibilities:
- Operate and manage modern data platforms - from streaming to data lakes to analytics and beyond - across a progressively evolving technical stack.
- Learn new technologies in a quickly changing field.
- Own the execution of tasks, and field questions about tasks other engineers are working on within the project.
- Respond to pager incidents, solve hard and challenging problems, and go deep into customer processes and workflows to resolve issues.
- Demonstrate clear ownership of tasks on multiple simultaneous customer accounts across a variety of technical stacks.
- Continually grow, learn, and stay up-to-date with the MS technology stack.
- Work 24/7 rotational shifts.
Required Experience:
- Working knowledge of SQL and the ability to write, debug, and optimize SQL queries.
- Good understanding of writing and optimizing Python programs.
- Experience in providing operational support across a large user base for a cloud-native data warehouse (Snowflake and/or Redshift).
- Experience with cloud-native data technologies in AWS or Azure.
- Proven experience learning new technology stacks.
- Strong troubleshooting and performance tuning skills.
- Client-facing written and verbal communication skills and experience.
Preferred Experience:
- Production experience and certifications in core data platforms such as Snowflake, AWS, Azure, GCP, Hadoop, or Databricks.
- Production experience working with Cloud and Distributed Data Storage technologies such as S3, ADLS, HDFS, GCS, Kudu, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems.
- Production experience working with Data integration technologies such as Spark, Kafka, event/streaming, StreamSets, Matillion, Fivetran, HVR, NiFi, AWS Database Migration Service, Azure Data Factory, or others.
- Production experience working with Workflow Management and Orchestration such as Airflow, AWS Managed Airflow, Luigi, NiFi.
- Working experience with infrastructure as code using Terraform or CloudFormation.
- Expertise in a scripting language to automate repetitive tasks (Python preferred).
- Well versed in continuous integration and deployment frameworks, with hands-on experience using CI/CD tools like Bitbucket, GitHub, Flyway, and Liquibase.
- Bachelor's degree in Computer Science or a related field.
Perks and Benefits
- Medical Insurance for Self & Family
- Medical Insurance for Parents
- Term Life & Personal Accident
- Wellness Allowance
- Broadband Reimbursement
- Professional Development Allowance
- Reimbursement of Skill Upgrade Certifications
- Certification Reimbursement
phData celebrates diversity and is committed to creating an inclusive environment for all employees. Our approach helps us to build a winning team that represents a variety of backgrounds, perspectives, and abilities. So, regardless of how your diversity expresses itself, you can find a home here at phData. We are proud to be an equal opportunity employer. We prohibit discrimination and harassment of any kind based on race, color, religion, national origin, sex (including pregnancy), sexual orientation, gender identity, gender expression, age, veteran status, genetic information, disability, or other applicable legally protected characteristics. If you would like to request an accommodation due to a disability, please contact us at People Operations.