The role described here is for an engineer who will build and maintain systems that collect, normalize, and serve security-related data at scale. You will be expected to deliver robust code and architectures inside Google Cloud's production environment while collaborating with analysts and data scientists to create defensive capabilities. The position values practical experience in both software test engineering and consumer electronics tooling, and it expects familiarity with distributed systems, machine learning, and applied natural language techniques. The successful candidate should be comfortable turning ambiguous, real-world web signals into reliable, structured artifacts.
This opportunity requires a combination of hands-on engineering and research-oriented thinking. Candidates must show proven experience in automated collection and extraction workflows and be able to scale services using distributed databases and modern production practices. You will be responsible for designing high-throughput APIs, managing data pipelines that digest noisy content, and integrating ML or LLM capabilities to convert unstructured inputs into usable intelligence. Strong collaborative instincts and an owner mentality are essential to align technical work with operational security needs.
Who we are looking for
Minimum qualifications include a bachelor’s degree or equivalent practical experience and at least five years in software test engineering, consumer electronics, or internal tooling. Prior work involving research into artificial intelligence, data mining, natural language processing, image classification, spam mitigation, or related fields is also expected. These foundations enable you to design resilient systems and evaluate models that operate under real-world constraints. Experience building end-to-end solutions that move from prototype to production will make you effective quickly in this role.
Preferred candidates will have hands-on knowledge of building and operating distributed systems within large-scale production environments and familiarity with Spanner or similar globally distributed databases. A solid understanding of network security, threat intelligence, and operational security practices is valuable when working with sensitive or dynamic data sources. You should also be adept at sanitizing and interpreting messy, unstructured content using a blend of regex, heuristics, and LLM integrations to produce well-structured outputs for downstream use.
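To illustrate the kind of sanitization work this describes, here is a minimal sketch (all names and patterns are hypothetical, not part of any stated toolchain) that combines regexes with simple heuristics to turn a noisy text blob into structured records:

```python
import re

# Hypothetical patterns for two common indicator types.
IPV4_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")
DOMAIN_RE = re.compile(r"\b(?:[a-z0-9-]+\.)+[a-z]{2,}\b", re.IGNORECASE)

def extract_indicators(raw: str) -> list[dict]:
    """Turn messy text into structured indicator records."""
    records = []
    for match in IPV4_RE.findall(raw):
        # Heuristic: discard octets outside 0-255 that the loose regex allows.
        if all(0 <= int(part) <= 255 for part in match.split(".")):
            records.append({"type": "ipv4", "value": match})
    for match in DOMAIN_RE.findall(raw):
        # Heuristic: lowercase for stable downstream comparisons.
        records.append({"type": "domain", "value": match.lower()})
    return records

print(extract_indicators("Beacon to 203.0.113.7 and evil-site.example over HTTPS"))
```

In practice, an LLM integration would sit after a pass like this, handling the long tail of formats the regexes and heuristics cannot.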
Responsibilities and technical expectations
You will design, implement, and operate APIs and ingestion systems that support data flow across a broader threat intelligence architecture. This includes creating reliable, scalable APIs and engineering automated web collection tools and scrapers that target the surface web, deep web, dark web, and messaging platforms. Your work will ensure continuous monitoring of adversary communications and convert raw captures into normalized records. Emphasis will be placed on building systems that tolerate noise and adversarial behavior while prioritizing data quality and availability.
Data pipeline design and ML integration
Architecting pipelines is central to the role: you will transform noisy web content into structured intelligence through applied machine learning and LLM-driven parsers. This involves feature extraction, normalization, entity resolution, and enrichment steps that support analyst workflows and automated defenses. The position requires operationalizing models and ensuring that inference and preprocessing scale in production, leveraging distributed databases and resilient compute designs to maintain high availability and low latency.
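A toy sketch of that extraction-normalization-enrichment sequence (stage names and the handle-based entity heuristic are hypothetical, chosen only to make the shape concrete) treats the pipeline as an ordered list of pure stage functions, so each step stays individually testable:

```python
def extract(rec: dict) -> dict:
    # Toy extraction: treat @-prefixed tokens as candidate entities.
    rec["entities"] = [w for w in rec["text"].split() if w.startswith("@")]
    return rec

def normalize(rec: dict) -> dict:
    # Toy entity resolution: lowercase handles so "@Acme" and "@acme" merge.
    rec["entities"] = sorted({e.lower() for e in rec["entities"]})
    return rec

def enrich(rec: dict) -> dict:
    # Toy enrichment: attach a derived feature for analyst workflows.
    rec["entity_count"] = len(rec["entities"])
    return rec

PIPELINE = [extract, normalize, enrich]

def run(rec: dict, stages=PIPELINE) -> dict:
    for stage in stages:
        rec = stage(rec)
    return rec

out = run({"text": "@Acme contacted @mallory and @acme again"})
print(out["entities"], out["entity_count"])
```

In a real system each stage would be a service or batch job and the heuristics would be replaced by trained models or LLM parsers, but the composable-stage structure is what keeps inference and preprocessing independently scalable.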
Collaboration, compliance, and culture
This role demands close partnership with threat analysts, data scientists, and internal security teams to convert collection capabilities into high-fidelity defensive tools. You’ll participate in cross-functional planning, prioritize features that deliver measurable protection, and document interfaces and behaviors clearly. All applicant data and materials are handled per Google’s Applicant and Candidate Privacy Policy, and the work itself must follow operational security guidance when interacting with sensitive sources. Clear, respectful communication and the ability to translate technical trade-offs to non-engineering stakeholders are vital.
Policy, inclusion, and practical notes
Google is an equal opportunity and affirmative action employer committed to building an inclusive workforce that reflects its users. Reasonable accommodations are available for applicants who need them; candidates should request support through the stated accommodations form. English proficiency is required to enable global collaboration unless a job posting explicitly states otherwise. Recruitment agencies should note that unsolicited resumes are not accepted; Google does not pay fees for agency-submitted candidates. These operational details ensure transparent hiring and fair consideration for all applicants.

