TRM is on a mission to build a safer financial system for billions of people. We deliver a blockchain intelligence data platform to financial institutions, crypto companies, and governments to fight cryptocurrency fraud and financial crime. We consider our business — and our profit — as a way to move towards our mission sustainably and at scale.
The Core Data Ingestion team comprises engineers who work cross-functionally to build data pipelines that ingest valuable information, further enriching the attribution displayed to our customers in our product. As a Software Engineer on the Core Data Ingestion team, you will build and operate mission-critical systems and data services that ingest and analyze blockchain transaction activity at petabyte scale, ultimately working to build a safer financial system for billions of people.
The impact you’ll have here:
- Build highly reliable data services to integrate with dozens of blockchains.
- Develop complex ETL pipelines that transform and process petabytes of structured and unstructured data in real time.
- Design data models for optimal storage and retrieval, supporting sub-second latency for querying blockchain data.
- Collaborate across departments, partnering with data scientists, backend engineers, and product managers to design and implement novel data models that enhance TRM’s products.
What we’re looking for:
- The ability to write high-quality code. We mostly work in Python. However, languages can be learned: we care much more about your general engineering skill than knowledge of a particular language or framework.
- Versatility. Experience across the entire spectrum of data engineering, including:
  - Data stores (e.g., ClickHouse, Elasticsearch, Postgres, Redis, Neo4j)
  - Data pipeline and workflow orchestration tools (e.g., Airflow, dbt, Luigi, Azkaban, Storm)
  - Data processing technologies and streaming workflows (e.g., Spark, Kafka, Flink)
  - Deployment and monitoring infrastructure on public cloud platforms (e.g., Docker, Terraform, Kubernetes, Datadog)
  - Loading, querying, and transforming large data sets
- Comfort working with noisy, dirty, and unstructured data: cleansing, scraping, and converting it into structured data.
- A high degree of initiative and ownership, combined with the ability to navigate ambiguity and adapt quickly to change.
- Exceptional ability to structure problems and identify the most critical issues to prioritize.
- Ability to communicate complex ideas effectively to both technical and non-technical audiences, verbally and in writing.
- Experience working collaboratively in a cross-functional environment with a diverse group of people at all levels of an organization.
- Passion for building a safer financial system for billions of people.
- Interest in virtual currencies and their applications in financial networks, and in blockchain analysis, is a plus.