Location: Santa Clara, CA, United States
Date Posted: 10-14-2017
Position Role/Title: SYSTEM ARCHITECT

This is an exciting role at Client, where you will deliver cutting-edge open-source solutions to FTSE 100 customers. The Systems Architect is part of the Professional Services organization, working across a diverse range of industries and projects to enable our customers on their Big Data journey, and engaging with customers from Proof of Concept (POC) stages through to the implementation of complex distributed production environments. You will work collaboratively with customers to optimize performance, develop reference architectures, and form part of a team that fosters a long-standing collaborative relationship with our customer group.

Responsibilities:
  • Drive POCs with customers to successful completion
  • Analyze complex distributed production deployments, and make recommendations to optimize performance
  • Help develop reference Hadoop architectures and configurations
  • Write and produce technical documentation, knowledge base articles
  • Work directly with prospective customers' technical resources to devise and recommend solutions based on the understood requirements
  • Participate in the pre- and post-sales process, helping both the sales and product teams to interpret customers' requirements
  • Work closely with Client's teams at all levels to ensure rapid response to customer questions and project needs
  • Play an active role within the Open Source community

Requirements:
  • More than five years of Professional Services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions
  • 2+ years designing and deploying 3-tier architectures or large-scale Hadoop solutions
  • Experience working with Apache Hadoop, including knowledge of how to create and debug Hadoop jobs
  • Ability to understand big data use-cases, and recommend standard design patterns commonly used in Hadoop-based deployments
  • Ability to understand and translate customer requirements into technical requirements
  • Experience implementing data transformation and processing solutions using Apache Pig
  • Experience designing queries against data stored in HDFS using tools such as Apache Hive
  • Experience implementing MapReduce jobs
  • Experience setting up multi-node Hadoop clusters
  • Strong experience implementing software and/or solutions in the enterprise Linux or Unix environment
  • Strong understanding of the Java ecosystem and enterprise offerings, including debugging and profiling tools (JConsole), logging and monitoring tools (Log4j, JMX), and security offerings (Kerberos/SPNEGO)
  • Familiarity with scripting tools such as bash shell scripts, Python and/or Perl
  • Demonstrable experience using R and the algorithms provided by Mahout
  • Certifications are an advantage but not essential
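Several of the requirements above center on the MapReduce programming model behind Hadoop jobs. As a minimal sketch only (plain Python standing in for an actual Hadoop job; the function names are ours, not a Hadoop API), the map/shuffle/reduce flow of the canonical word-count job looks like:

```python
from collections import defaultdict

def map_phase(lines):
    # Emit (word, 1) pairs, as a Hadoop mapper would.
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def shuffle(pairs):
    # Group values by key, mimicking the framework's shuffle/sort step.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Sum the counts for each word, as a Hadoop reducer would.
    return {word: sum(counts) for word, counts in grouped.items()}

lines = ["big data big clusters", "big data"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # {'big': 3, 'data': 2, 'clusters': 1}
```

In a real Hadoop deployment the mapper and reducer run as distributed tasks over HDFS blocks, and the framework performs the shuffle; the sketch above only illustrates the data flow a candidate would be expected to reason about.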

Central Business Solutions, Inc.
37600 Central Ct., Suite #214
Newark, CA 94560
(A Certified Minority-Owned Organization)
Phone: (510) 713-9900, (510) 573-5500  Fax: (510) 740-3677