prodapt

Forward Deployed Engineer – Data & AI Presales

Company

prodapt

Role

Forward Deployed Engineer – Data & AI Presales

Job type

-

Posted

8 hours ago

Salary

Not disclosed by employer

Job description

Overview

Role: Forward Deployed Engineer – Data & AI Presales
Location: India (Bangalore / Hyderabad / Chennai / Pune — Flexible)

Role Purpose

Act as a Forward Deployed Engineer (FDE) bridging the gap between field presales and product/platform capabilities for large-scale data modernisation and AI transformation deals. This is a customer-facing, hybrid-skillset role that combines software development, applied AI engineering, data science, and solution architecture to rapidly prototype winning solutions, demonstrate platform value with real-world use cases, and feed field intelligence back into the organisation's product and accelerator roadmap. The FDE operates at the intersection of engineering and presales — building what needs to be built, demoing what needs to be seen, and shaping what needs to be sold.

The FDE DNA

- Customer-Facing Engineer — works directly with clients to understand needs and demonstrate how platforms and AI solve real-world problems.
- Rapid Prototyper — builds quick proofs-of-concept, custom apps, agents, dashboards, and integrations to validate ideas before full productisation.
- Bridge Between Field & Accelerator — provides structured feedback from customer engagements to shape the product and accelerator roadmap (e.g., Agent Craft, Agent Factory).
- Hybrid Skillset — combines coding, solution architecture, data science, and product thinking in one role.
- Future Oriented — tests and develops forward-looking use cases (agentic AI, multimodal, autonomous workflows) that may influence official offerings later.

Key Responsibilities

Customer Engagement & Presales Solutioning

- Participate in requirement discovery workshops, stand-ups with customer teams, and problem-framing sessions as part of deal pursuits.
- Align on problem statements, success metrics, and AI readiness assessments with client stakeholders.
- Lead AI art-of-the-possible sessions and executive demos that translate platform capabilities into business outcomes.
- Develop technical proposals, solution architectures, effort estimations, and competitive differentiators under tight deal timelines.
- Support the Americas Presales Lead across RFPs, proactive pursuits, and strategic multi-year transformation deals.

Rapid Prototyping & Applied AI Engineering

- Build quick PoCs — AI agents, intelligent apps, dashboards, and integrations — to validate ideas and demonstrate value during deal cycles.
- Integrate APIs, data sources, LLM services, and platform capabilities into working prototypes.
- Demo early mockups and functional prototypes to gather client feedback and iterate rapidly.
- Design end-to-end AI solution architectures covering GenAI (LLMs, RAG, AI agents, prompt engineering, fine-tuning), classical ML (classification, regression, clustering, time series), NLP, and computer vision.
- Build interactive demo applications using Streamlit, Gradio, FastAPI, and React/Next.js to showcase AI solutions in client workshops.
- Leverage and contribute to organisational accelerators (Agent Craft, Agent Factory, AI tools) to speed up presales delivery.
- Note: a minimum of one year of dedicated AI presales experience — crafting and demonstrating AI solutions for deal pursuits, not just delivery — is required.

Data Science, Data Engineering & Exploration

- Perform rapid data exploration, profiling, and quality assessment on client datasets to evaluate AI readiness and opportunity sizing.
- Apply data science techniques to support presales — EDA, statistical modelling, hypothesis validation, feature engineering, and rapid model prototyping.
- Design modern data architectures bridging engineering and science — lakehouse patterns, feature stores, real-time pipelines, and governed data products.
- Build data pipelines, transformation layers, and data wrangling workflows that feed AI/ML workloads.
- Set up pipelines and databases, and apply evaluation metrics (accuracy, latency, bias) to validate solution approaches.
- Work across Databricks, Snowflake, and Google Cloud Data & Analytics (BigQuery, Vertex AI, Dataflow, Dataplex).

Software Engineering & DevOps Excellence

- Bring production-grade software engineering discipline to all presales assets — clean architecture, TDD, documentation, reproducibility, and scalability.
- Develop across the full stack: backend (Python, FastAPI, REST APIs), frontend (React/Next.js, streaming UIs), and DevOps (Docker, Terraform, CI/CD).
- Apply MLOps and LLMOps practices including experiment tracking (MLflow, Weights & Biases), model registries, evaluation harnesses with quality gates in CI, tracing (OpenTelemetry), and monitoring (Prometheus/Grafana).
- Implement guardrails, safety policies, and red-teaming approaches for AI solutions.
- Build and maintain reusable demo environments, accelerators, estimation templates, and proof-of-concept kits.

Data Modernisation for AI

- Design migration strategies from legacy platforms (Oracle, Teradata, Netezza) to AI-ready cloud architectures.
- Articulate the modernisation journey — Data Warehouse → Lakehouse → AI Platform — with clear value at each stage.
- Ensure modernised architectures are optimised for AI workloads: feature engineering, training pipelines, model serving, and feedback loops.
- Integrate AI-assisted modernisation techniques including automated code conversion, intelligent data mapping, and AI-powered testing.

Thought Leadership & Digital Presence

This is a non-negotiable requirement. The candidate must have an active and demonstrable digital presence focused on AI and data, evidenced by a meaningful combination of the following:

- Published technical blogs, articles, or newsletters on AI/ML and data topics (Medium, Substack, personal blog, LinkedIn articles).
- Active conference speaking — talks, panels, or workshops at industry events, meetups, or webinars.
- Open-source contributions or publicly available projects on GitHub related to AI/ML.
- A strong LinkedIn or X (Twitter) presence with regular, substantive posts on AI trends, techniques, and industry perspectives.
- YouTube, podcast, or video content on AI topics.

In this role, the FDE will also:

- Contribute to organisational thought leadership by publishing POVs, whitepapers, solution frameworks, and competitive analyses.
- Represent the organisation at partner events (Databricks, Snowflake, Google Cloud), industry conferences, and client innovation days.
- Stay at the cutting edge of AI — foundation models, agentic AI, multimodal AI, AI safety, autonomous agents — and translate these into enterprise-relevant presales assets.

Bridge Between Field & Accelerators

- Provide structured, actionable feedback from customer engagements to shape the accelerator roadmap.
- Identify patterns across deals to propose new reusable accelerators, reference architectures, and demo assets.
- Collaborate with platform engineering, LLM engineering, and data prep teams to industrialise field-proven solutions.
- Work with governance teams on security, ethics, and policy awareness for AI deployments.
- Train and enable client business users to become AI champions and agent builders as part of presales engagement.

Required Skills & Experience

Background & Experience

- 6–12 years of experience with a strong foundation in data engineering or data science, combined with solid software engineering skills across frontend, backend, and DevOps.
- Minimum 1 year of dedicated AI presales experience — designing, prototyping, and demonstrating AI solutions as part of deal pursuits, client workshops, or strategic presales engagements. Delivery-only AI experience does not qualify.
- Proven ability to rapidly prototype end-to-end AI/ML solutions — from data exploration and feature engineering through model development, deployment, and interactive demos.
- Hands-on experience with at least two of three core platforms: Databricks, Snowflake, and Google Cloud (BigQuery, Vertex AI, Dataflow).
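As an illustration of the rapid data exploration and AI-readiness profiling this role calls for, here is a minimal, stdlib-only sketch; the sample columns and the choice of signals (null rate and cardinality) are invented for illustration, not anything the posting prescribes:

```python
# Hypothetical first-pass data profiling: per-column null rate and
# distinct-value count over a list of records, as a quick AI-readiness signal.
from collections import defaultdict


def profile(rows: list[dict]) -> dict:
    """Return {column: {"null_rate": ..., "cardinality": ...}} for tabular records."""
    nulls = defaultdict(int)
    distinct = defaultdict(set)
    for row in rows:
        for col, val in row.items():
            if val is None or val == "":
                nulls[col] += 1       # treat None/empty string as missing
            else:
                distinct[col].add(val)
    n = len(rows)
    columns = {c for row in rows for c in row}
    return {
        col: {"null_rate": nulls[col] / n, "cardinality": len(distinct[col])}
        for col in columns
    }


if __name__ == "__main__":
    # Invented sample dataset; in a real engagement this would be a client extract.
    sample = [
        {"customer_id": 1, "churned": "yes", "plan": "gold"},
        {"customer_id": 2, "churned": "", "plan": "silver"},
        {"customer_id": 3, "churned": "no", "plan": "gold"},
    ]
    for col, stats in sorted(profile(sample).items()):
        print(col, stats)
```

In practice this kind of check would run against a warehouse sample (e.g., on Databricks or BigQuery) before opportunity sizing; the point is only that profiling output, not notebooks alone, anchors the readiness conversation.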
Technical Skills

- Applied AI Engineering: deep hands-on knowledge of GenAI (LLMs, RAG, prompt engineering, fine-tuning, AI agents, structured outputs, guardrails), classical ML, NLP, and computer vision. Proficiency with frameworks including PyTorch, TensorFlow, Hugging Face, LangChain, LlamaIndex, and OpenAI APIs.
- Data Science: EDA, statistical modelling, hypothesis testing, feature engineering, and rapid prototyping. Proficiency in Python and SQL is essential; R is a plus.
- Data Engineering: pipeline design, ETL/ELT, streaming architectures, data modelling, and feature store design. Experience with Spark, Kafka, dbt, Airflow, or equivalent.
- Software Development: full-stack capability — Python/FastAPI backend, React/Next.js frontend, clean architecture, TDD. The candidate must write production-grade, testable, deployable code, not just notebook-level prototypes.
- DevOps & MLOps/LLMOps: Docker, Terraform, CI/CD, experiment tracking (MLflow, W&B), evaluation harnesses (RAGAS, BLEU/ROUGE), tracing (OpenTelemetry), monitoring (Prometheus/Grafana), and red-teaming.

Thought Leadership & Digital Presence

- An established and verifiable digital presence focused on AI is mandatory; candidates must provide links to their published work, profiles, or portfolios as part of the application.
- Demonstrated ability to simplify complex AI topics for business and technical audiences.

Communication & Presales Skills

- Excellent communication — ability to present to CxO stakeholders, lead workshops, run demos, and write compelling proposals.
- Strong problem-framing and consulting skills — can translate ambiguous client needs into structured solution approaches.
- Experience creating solution architectures, technical decks, and effort estimations for large deals ($5M+).
- Comfort working with Americas clients across time zones (IST evening overlap required).
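The evaluation harnesses with quality gates mentioned under DevOps & MLOps/LLMOps can be sketched in a few lines. This is a framework-free illustration of the pattern only; the metric names, thresholds, and gate logic below are assumptions, not requirements from the posting:

```python
# Illustrative quality gate: every metric must clear its threshold before a
# demo model or prompt chain is promoted. In CI, a failed gate would fail
# the build. Metrics and thresholds here are hypothetical.
from dataclasses import dataclass


@dataclass
class EvalResult:
    name: str
    value: float
    threshold: float
    higher_is_better: bool = True

    def passed(self) -> bool:
        if self.higher_is_better:
            return self.value >= self.threshold
        return self.value <= self.threshold


def run_quality_gate(results: list[EvalResult]) -> bool:
    """Print a report and return True only if every metric clears its gate."""
    ok = True
    for r in results:
        status = "PASS" if r.passed() else "FAIL"
        print(f"{r.name:>12}: {r.value:.3f} (gate {r.threshold}) {status}")
        ok = ok and r.passed()
    return ok


if __name__ == "__main__":
    results = [
        EvalResult("accuracy", 0.91, 0.85),
        EvalResult("latency_s", 0.42, 1.0, higher_is_better=False),
        EvalResult("bias_gap", 0.03, 0.05, higher_is_better=False),
    ]
    print("gate passed:", run_quality_gate(results))
```

A real harness would source these values from tools named in the posting (RAGAS scores, MLflow-logged metrics, OpenTelemetry latency traces); the gate shape stays the same.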
Preferred Qualifications

- Certifications in Databricks (ML Associate/Professional), Google Cloud (Professional ML Engineer / Data Engineer), Snowflake (SnowPro), or cloud-native AI certifications.
- Published speaker at recognised industry conferences (Data + AI Summit, Google Cloud Next, Snowflake Summit, PyCon, or equivalent).
- Active open-source contributor with a visible GitHub profile.
- Experience building and contributing to internal accelerator platforms (agent frameworks, use case factories, evaluation suites).
- Familiarity with AI safety, responsible AI frameworks, and AI governance.
- Exposure to telecom, BFSI, or enterprise verticals.
- Experience with Japanese enterprise clients or cross-cultural engagement is a significant plus.
