Data Engineer
MeridianLink
Location
US Remote
Employment Type
Full time
Location Type
Remote
Department
Research & Development
Compensation
$98.9K – $134.5K
MeridianLink runs a comprehensive background check, credit check, and drug test as part of our offer process.
It is not typical for offers to be made at or near the top of the salary range. The actual salary will be determined based on experience and other job-related factors permitted by law including geographical location.
MeridianLink offers:
Insurance coverage (medical, dental, vision, life, and disability)
Flexible paid time off
Paid holidays
401(k) plan with company match
Remote work
All compensation and benefits are subject to the terms and conditions of the underlying plans or programs, as applicable and as may be amended, terminated, or superseded from time to time.
#LI-REMOTE
We are looking for an accomplished Data Engineer to join our quickly growing Analytics team. This role will be responsible for expanding and improving our data and data pipeline architecture, monitoring those pipelines, and optimizing data flow and master data management (MDM) for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.
The Data Engineer will support database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data pipelines to support our next generation of products and data initiatives.
RESPONSIBILITIES
• Design, develop, and operate large-scale data pipelines to support internal and external consumers
• Improve and automate internal processes
• Monitor jobs and pipelines, resolving issues and/or alerting the team
• Integrate data sources to meet business requirements
• Write robust, maintainable, well-documented code
QUALIFICATIONS
• 2–4 years of professional Data Engineering and Data Warehousing experience
• Strong implementation experience with Python, Parquet, Spark, Azure Databricks, Delta Lake, Databricks Data Warehouse, Databricks Workflows, Delta Sharing, and Unity Catalog
• Experience managing data ingress (import) and egress (export) processes within Informatica ETL workflows, as well as utilizing Sigma analytics environments
• SQL development knowledge – stored procedures, triggers, jobs, indexes, partitioning, pruning, etc.
• Ability to write and debug complex SQL queries
• ETL/ELT and Data-warehousing techniques and best practices
• Experience building, maintaining, and scaling ETL/ELT processes and infrastructure
• Implementation experience with various data modeling techniques
• Implementation experience working with a BI visualization tool (Sisense is a plus)
• Experience with CI/CD tools (GitLab and Jenkins preferred)
• Experience with cloud infrastructure (Azure strongly preferred)
• Experience working in a fast-paced product environment, with a focus on getting the job done while minimizing tech debt
• Experience with UI development frameworks such as JavaScript, Django, or React a plus
• Ability to work with a variety of ingestion patterns, such as APIs and SQL servers
• Knowledge of Master Data Management
• Prior financial industry experience a plus
• Ability to navigate ambiguity and pivot easily based on business priorities
• Strong communication, negotiation, and estimation skills
• Team player who collaborates well
