Specialist Solutions Architect - Tokyo, Japan - Databricks


    FEQ125R106

    As a Specialist Solutions Architect (SSA), you will guide customers in building big data solutions on Databricks that span a wide variety of use cases. These are customer-facing roles, working with and supporting the Solution Architects, requiring hands-on production experience with Apache Spark and expertise in other data technologies. SSAs help customers through the design and successful implementation of essential workloads while aligning their technical roadmap for expanding usage of the Databricks Lakehouse Platform. As a go-to expert reporting to the Specialist Field Engineering Manager, you will continue to strengthen your technical skills through mentorship, learning, and internal training programs, and establish yourself in an area of specialty - whether that be performance tuning, machine learning, industry expertise, or more.

    The impact you will have:

  • Provide technical leadership to guide strategic customers to successful implementations on big data projects, ranging from architectural design to data engineering to model deployment
  • Architect production-level workloads, including end-to-end pipeline load performance testing and optimization
  • Provide technical expertise in an area such as data management, cloud platforms, data science, machine learning, or architecture
  • Assist Solution Architects with more advanced aspects of the technical sale including custom proof of concept content, estimating workload sizing, and custom architectures
  • Improve community adoption (through tutorials, training, hackathons, conference presentations)
  • Contribute to the Databricks Community
    What we look for:

  • 2-3 years of experience in a customer-facing technical role, with expertise in Data Science/Machine Learning including model selection, model lifecycle management, hyperparameter tuning, model serving, and deep learning.
  • You have experience in the design and implementation of big data technologies such as Apache Spark/Delta, Hadoop, NoSQL, MPP, OLTP, and OLAP.
  • You've maintained and extended production data systems to evolve with complex needs.
  • You have production-level experience in at least one of Python, R, Scala, or Java.
  • You have specialty expertise in at least one of the following areas:
      • Scaling big data workloads to be performant and cost-effective.
      • Development tools for CI/CD, unit and integration testing, automation and orchestration, REST APIs, BI tools, and SQL interfaces.
      • Designing data solutions on cloud infrastructure and services such as AWS, Azure, or GCP, using best practices in cloud security and networking.
      • ML concepts including model tracking, model serving, and other aspects of productionizing ML pipelines in distributed data processing environments like Apache Spark, using tools such as MLflow.
  • You have a degree in a quantitative discipline (Computer Science, Applied Mathematics, Operations Research).
  • You are a native-level Japanese speaker who can also communicate in English with colleagues overseas.
    Benefits

  • Benefits allowance
  • Equity awards
  • Paid parental leave
  • Gym reimbursement
  • Annual personal development fund
  • Work headphones reimbursement
  • Business travel insurance
  • Mental wellness resources