We are looking for a seasoned Principal Software Engineer to shape the foundational architecture behind our client's Unified API platform. This role centers on building high-performance distributed systems, powering data-intensive services, and enabling seamless, secure integrations across modern IT and security ecosystems.
This is a fully remote position (U.S. only) and best suited for engineers who excel in fast-moving startup environments where autonomy, rapid iteration, and strong technical judgment are essential.
In this role, you will serve as a technical leader responsible for advancing our core backend platform. You’ll design and deliver complex systems—from microservices to large-scale data pipelines—and play a pivotal part in scaling our real-time infrastructure. Your work will directly influence how products across industries connect, integrate, and exchange data.
* Lead the design and development of distributed microservices and high-throughput backend systems.
* Drive improvements in system performance, low-latency processing, and large-scale reliability.
* Architect data storage strategies, including data lake and warehouse solutions that support analytics and ML.
* Build and optimize real-time streaming pipelines capable of processing massive data volumes.
* Deploy and operationalize machine learning models within production environments.
* Collaborate closely with the broader engineering and product teams to deliver platform-wide enhancements.
We are seeking an engineer with deep experience building systems from scratch, ideally in early-stage startup settings. Candidates should bring a track record of architecting and shipping highly reliable, technically sophisticated products.
* 10+ years of professional experience in software engineering.
* Bachelor's or Master's degree in Computer Science, or equivalent experience.
* Demonstrated expertise in architecting and building distributed systems, backend services, and large-scale data infrastructure.
* Strong capabilities in performance tuning, scalability, and reliability engineering.
* Hands-on experience designing data lakes and data warehouses for analytics or ML applications.
* Proven ability to build and maintain real-time data streaming pipelines handling terabyte-scale workloads.
* Background in integrating machine learning systems into production workflows.
* Proficiency with backend technologies such as Go, Git, Kubernetes, and OpenTelemetry.
* Significant experience building APIs and web services.
* Strong knowledge of data processing, databases, and cloud-native architectures.
* Deep familiarity with AWS, GCP, or Azure services.
* Extensive use of Kubernetes, infrastructure as code (IaC), CI/CD pipelines, and GitOps practices.
* Exceptional written, verbal, and interpersonal communication skills.
* Prior experience at cybersecurity companies is a major plus.
* Must be authorized to work in the U.S.