Data Engineer
Perpetua
This job is no longer accepting applications
What you'll do:
- Write high-quality, well-documented code in Python and SQL
- Build data pipelines ranging from simple to complex, using Apache Airflow and AWS serverless technologies such as Lambda, Step Functions, and EventBridge
- Build ETL pipelines with Snowflake, AWS Glue, PySpark, and other ETL tools
- Work with a mix of structured and unstructured data across cloud-based batch and streaming architectures
- Engage directly with technical analysts, project managers, and other technical teams to help build concise requirements and ensure timely completion of projects
- Work with Git, CI/CD, and version control to maintain code and documentation
- Design and vet solutions for technical problems, and solicit team feedback during the design process
- Mentor, manage, and train other engineers, and participate in pair programming in a lead capacity
Who you are:
- Must have experience with version control, GitHub, and the software development life cycle
- 4 years of experience with SQL and data modeling
- 4 years of experience developing with Python
- Demonstrated experience interacting with RESTful APIs
- Experience with data pipelines / batch automation in at least one major technology (e.g. Apache Airflow)
- Experience with one of the major cloud providers (AWS-preferred)
- Experience with AWS serverless technologies (Lambda, EventBridge, Step Functions, SQS)
- Experience working in an agile development environment
- Streaming experience (Kafka, Kinesis, etc.)
- Familiarity with Jira
- Experience with other AWS technologies: EC2, Glue, Athena, etc.
- Experience with additional cloud platforms beyond AWS
- Experience developing CI/CD, automations, and quality of life improvements for developers