PROJECT DESCRIPTION::
Company is a leading capital markets fintech firm driving technological innovation and loan-level transparency in structured finance. As the world’s first end-to-end data management, reporting, and analytics platform for loan-level lending data, we are bringing unparalleled transparency and intelligence to every loan for every stakeholder.
PROJECT STACK and TEAM::
Core team of 20+ engineers in the US and LATAM;
2 interview stages: 1) Manager Review (30-40 min), 2) Technical Check (1.5-2 hr).
MAIN REQUIREMENTS::
- Experience working with data pipelines in Google Cloud Platform (GCP). You have 1+ years of experience working with large datasets in a GCP-native environment. Ideally, you are comfortable loading and defining tables in BigQuery, working with GCP command-line tools, and debugging Pub/Sub messages and Cloud Functions (see the sketch after this list).
- Comfortable writing production Python and SQL. You have 2+ years of experience writing production Python or Scala code to ingest, wrangle, and transform data. Familiarity with Spark is a plus. You understand standard CI/CD concepts and advanced SQL.
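To give a concrete flavor of the BigQuery work described above, here is a minimal sketch of loading a CSV file from Cloud Storage into a table with the google-cloud-bigquery Python client. The project, bucket, dataset, and table names are hypothetical placeholders, not actual company resources:

    # A minimal sketch, assuming the google-cloud-bigquery client library is
    # installed; all resource names below are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # skip the CSV header row
        autodetect=True,       # let BigQuery infer the schema
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    # Load a loan tape from Cloud Storage into a BigQuery table.
    load_job = client.load_table_from_uri(
        "gs://example-bucket/loans/2024-01.csv",
        "example-project.loans.loan_tape",
        job_config=job_config,
    )
    load_job.result()  # block until the load job finishes; raises on failure

    table = client.get_table("example-project.loans.loan_tape")
    print(f"Loaded table with {table.num_rows} rows.")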
GOOD TO HAVE::
- An excellent communicator. You have strong English proficiency and are comfortable taking requests from clients and translating them into the required technical tasks. You enjoy collaborating with analysts and other internal teams to proactively resolve their data issues and build tools that make their processes more scalable and efficient.
- A self-starter. You thrive on tasks that require you to extend yourself and learn new technologies and concepts. You take proactive ownership of the tasks assigned to you and always aim to leave the codebase in a better state than you found it.
JOB RESPONSIBILITIES::
- Proactively troubleshoot our end-to-end data pipelines, making sure data flows successfully through our systems to provide timely and accurate updates for more than 400 customers. You will track down web scraping, API, and SFTP errors as data enters the pipeline and triage issues as the data flows back out to clients.
- Ensure clients can seamlessly connect their data to our platform. Help get new clients set up to deliver data to our platform, and update existing configurations to handle new datasets. Manage the outgoing delivery of data our platform has processed to downstream clients.
- Support internal analyst teams. Help ensure that the infrastructure our analyst teams rely on is working as intended. Build out automated pipelines to stage data in our data lake for analysts' use cases (an Airflow sketch follows this list).
- Improve our data infrastructure. Work on new features to improve the performance and scalability of our existing pipelines, using a cutting-edge tech stack that includes BigQuery, Apache Spark, Airflow, and DBT.
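As a flavor of the staging work mentioned above, here is a minimal Airflow sketch of a daily job that rebuilds an analyst-facing table in BigQuery. It assumes Airflow 2.x with the apache-airflow-providers-google package installed; the DAG ID, project, and table names are hypothetical placeholders:

    # A minimal sketch, assuming Airflow 2.x and the Google provider package;
    # the DAG ID and all resource names are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    with DAG(
        dag_id="stage_loan_data",
        schedule="@daily",               # Airflow 2.4+; older versions use schedule_interval
        start_date=datetime(2024, 1, 1),
        catchup=False,
    ) as dag:
        # Rebuild an analyst-facing staging table from the raw landing zone.
        stage_loans = BigQueryInsertJobOperator(
            task_id="stage_loans",
            configuration={
                "query": {
                    "query": (
                        "CREATE OR REPLACE TABLE `example-project.staging.loans` AS "
                        "SELECT * FROM `example-project.raw.loans`"
                    ),
                    "useLegacySql": False,
                }
            },
        )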
SUMMARY::
- Long-term projects (1, 2, even 5+ years);
- Flexible working hours;
- 15 days of paid vacation, 10 days of unpaid vacation, and 10 days of national holidays;
- FREE English language classes;
- Healthy working environment and projects that use cutting-edge technologies;
- Career growth opportunities;
- Bonuses for personal referrals of new employees and new business;
- A working environment where you communicate and work directly with the Client.
This flexibility gives developers:
- A better work-life balance
- Increased productivity
- The ability to work any time around the clock
- Reduction in commute time
- Fewer sick days
- Health insurance
- More time with family and friends