Data Engineer

Company: Pairin (verified)
Budget: Fixed Price
Flexibility: Full remote
Preferred timezone: Central America Standard Time

About this job

Role: Data Engineer
Category: IT Jobs

Job description

Data Engineer

We’re looking for an energetic and enthusiastic Data Engineer to join our team.

You will work closely with our Product and Engineering teams to support our clients. You will collaborate with other team members to maintain and expand our data infrastructure.

PAIRIN’s mission is to make everyone’s journey more relevant and equitable by unifying workforce and education. PAIRIN’s My Journey platform connects students and clients to services, career guidance, skills development, and training through easy-to-implement personalized workflows. By disrupting government and education with technology designed for and around the user, PAIRIN’s science-based technology and personalized approach to client service ensure not only the success of our clients’ programs, but also the success of the people in their communities who benefit most from skills-based development and career services.

This position will ideally be hybrid, based in our Denver office, but we will also consider a fully remote arrangement for the right person. The initial base salary range for this position is $120,000 to $160,000, plus generous stock options and an annual performance-based bonus.

Why work for us?

Competitive salary, professional development budget, and stock options

Excellent healthcare, vision, and dental insurance, with 100% of the individual premium and part of the family premium covered by the employer

Flexible work hours and location

Unlimited PTO

Support for whatever makes you most effective at doing your job

A cool, dog-friendly office space in Denver’s RiNo District, with a fun team and lots of amenities

Recognized as a growing company:

2022 Best Places To Work for Small Colorado Companies (BuiltIn)

2021 EdTech Breakthrough Award for College Prep Company of the Year

2021, 2019, and 2018 Outside’s 50 Best Places to Work (and an honorable mention in 2020)

2019 Denver Business Journal Best Places to Work (#2 in Tech)

2017 Emerging Tech Company of the Year (Apex Awards), 2017 Colorado Companies to Watch (CCTW), and 2017 Denver Startup of the Year (Denver Chamber of Commerce)

About you

Excited about data and finding ways to use data to show the impact of our platform

Desire to be a leader in a rapidly growing team of engineers and work closely with a Product Manager to deliver value to users

Passion for technology and cutting-edge industry trends

Excellent time management skills and the ability to prioritize to meet deadlines

Eagerness to bring fresh ideas and a great attitude to the team

Ability to work creatively and analytically in a problem-solving environment

Prepared to accept critical feedback and continue to improve your engineering skills

Willingness to develop technical and business knowledge by seizing opportunities to learn

Open to working in a dog-friendly environment

Roles and Responsibilities

Own and optimize the data architecture to address the data needs of our rapidly growing business

Join a group of passionate people committed to delivering “happiness” to our users and to each other

Partner with data scientists, sales, marketing, operations, and product teams to build and deploy machine learning models that unlock growth

Build custom integrations between cloud-based systems using APIs

Write complex and efficient queries to transform raw data sources into easily accessible models by coding across several languages such as Java, Python, and SQL (a minimal sketch follows this list)

Architect, build, and launch new data models that provide intuitive analytics to the team

Build data expertise and own data quality for the pipelines you create
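
For illustration only (not part of the posting), here is a minimal sketch of the kind of raw-to-model transformation described above, using Pandas with an in-memory SQLite database; the table and column names are invented:

# Hypothetical example: roll raw event rows up into a per-user analytics model.
# The table and column names below are invented for illustration only.
import sqlite3

import pandas as pd

conn = sqlite3.connect(":memory:")

# Load a tiny stand-in for a raw production source.
raw = pd.DataFrame(
    {
        "user_id": [1, 1, 2],
        "event_type": ["login", "lesson_completed", "login"],
    }
)
raw.to_sql("raw_events", conn, index=False)

# A SQL query that transforms the raw events into an easily accessible model.
query = """
SELECT
    user_id,
    COUNT(*) AS total_events,
    SUM(CASE WHEN event_type = 'lesson_completed' THEN 1 ELSE 0 END) AS lessons_completed
FROM raw_events
GROUP BY user_id
"""
user_model = pd.read_sql_query(query, conn)
print(user_model)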

Experience requirements

Must have 5+ years of experience working with:

Python development, in a data-centric role

A data-centric stack: Pandas (critical), NumPy, SQL, Boto3

Relational databases: MySQL (ideally) or Postgres

Linux environments (Amazon Linux 2, Ubuntu, and shell scripting)

An AWS environment, with demonstrable skills in the AWS CLI and Boto3, and (ideally) familiarity with S3, RDS, ECS (Fargate), MWAA (Airflow), SQS, CloudFormation, and CloudWatch

CI/CD with GitHub Actions or similar, PyTest or similar, and automating Docker builds and tests

Git

Creating data infrastructure technologies from scratch using the right tools for the job

Building out data pipelines, with efficient ETL design, implementation, and maintenance (a minimal ETL sketch follows this list)

Solving complex data processing and storage challenges through scalable, fault-tolerant architecture
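
As a loose, hypothetical sketch of the ETL work mentioned in this list (the bucket, key, connection string, and table names are all invented), a single extract-transform-load step might look like this with Boto3, Pandas, and SQLAlchemy:

# Hypothetical ETL step: pull a raw CSV from S3 with Boto3, clean it with Pandas,
# and load it into a relational staging table. All names here are placeholders.
import io

import boto3
import pandas as pd
from sqlalchemy import create_engine

s3 = boto3.client("s3")

# Extract: read the raw object from S3 (bucket and key are placeholders).
obj = s3.get_object(Bucket="example-raw-data", Key="exports/users.csv")
raw = pd.read_csv(io.BytesIO(obj["Body"].read()))

# Transform: light cleanup before loading.
raw.columns = [c.strip().lower() for c in raw.columns]
raw = raw.drop_duplicates(subset=["user_id"])

# Load: write to a relational target (placeholder Postgres connection string).
engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/warehouse")
raw.to_sql("stg_users", engine, if_exists="replace", index=False)

In production, a step like this would typically run inside an orchestrator such as Airflow (MWAA) rather than as a standalone script.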

Very nice to have experience with:

Docker

Apache Airflow

SQLAlchemy / ORM

FastAPI / API development in general

The Python typing system and tools like MyPy and Pandera (a brief validation sketch follows this list)

Dashboarding tools such as Tableau

AWS CDK for architecture automation

Working with cross-functional teams, including Product and Design
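
As a brief, hypothetical illustration of the Pandera and typing items above (the schema and column names are invented), a type-checked DataFrame validation could look like this:

# Hypothetical sketch of schema validation with Pandera and Python type hints.
# The model and column names are invented for illustration only.
import pandas as pd
import pandera as pa
from pandera.typing import DataFrame, Series


class UserModel(pa.DataFrameModel):
    user_id: Series[int] = pa.Field(ge=1)
    lessons_completed: Series[int] = pa.Field(ge=0)


@pa.check_types
def summarize(users: DataFrame[UserModel]) -> pd.Series:
    # Pandera validates the incoming schema before this body runs.
    return users["lessons_completed"].describe()


df = pd.DataFrame({"user_id": [1, 2], "lessons_completed": [3, 0]})
print(summarize(df))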

Nice to have knowledge of:

Observability tools, especially in the context of AWS

NoSQL data sources (e.g. Elasticsearch)

Graph databases (Neo4j or Amazon Neptune)

NLP-based machine learning models, plus some skill with tools like fastai, TensorFlow, or similar

We thank all candidates for their interest in our company, but only shortlisted candidates will be contacted.

About Pairin

Headquarters: Denver, United States
Industry: Information Technology
Website: www.pairin.com
