Business Area:
Professional Services
Job Description:
The Professional Services team at Cloudera works on some of the most exciting distributed data problems at private and public sector organizations. As a team member,
You will engage with new customer prospects on topics ranging from technology strategy and business objectives through to production implementations at large multi-cluster deployments.
You will collaborate with our customers to own, devise, and evangelize reference enterprise data architectures.
You will join a team that fosters long-standing customer relationships, acting as a trusted advisor.
If this excites you, come join us and be part of the future of Data!
As a Senior Solutions Architect on our Public Sector team, you will:
Work directly with Federal customers' technical teams to devise and recommend solutions based on their requirements
Analyze complex distributed production deployments and make recommendations to optimize performance
Help design and implement Big Data architectures and configurations to enable our customers
Work closely with Cloudera teams at all levels to help ensure the success of consulting engagements with customers
Drive projects with customers to successful completion
Write technical documentation and knowledge base articles
Participate in the pre- and post-sales process, helping both the sales and product teams develop customers' requirements
Attend speaking engagements when needed
We're excited about you if you have:
TS/SCI clearance with Full Scope Polygraph
4+ years of customer-facing Professional Services experience architecting large-scale storage, data center, and/or globally distributed solutions within a Federal or IC agency
Experience designing and deploying large-scale production Hadoop solutions
Ability to understand and translate customer requirements into technical requirements
Experience designing data queries in a Hadoop environment using tools such as Apache Hive, Apache Phoenix, or Apache Spark
Experience installing and administering multi-node Hadoop clusters
Strong experience implementing solutions in an Enterprise Linux or Unix environment
Strong understanding of various enterprise security solutions such as LDAP and/or Kerberos
Good understanding of network configuration, devices, protocols, speeds and optimizations
Knowledge of programming and scripting languages
Strong experience using network-based APIs, preferably REST/JSON or XML/SOAP
Knowledge of database design, administration, and data modeling with star schemas
Experience implementing big data use cases and an understanding of standard design patterns commonly used in Hadoop-based deployments
You may also have:
Experience with structured programming languages such as Java, Python, etc.
Experience using streaming-centric solutions such as Kafka or Flink
Hands-on experience with Apache NiFi or Cloudera CFM.
Experience with software automation technologies such as Ansible, etc.
What you can expect from us:
Generous PTO Policy
Support for work-life balance with Unplugged Days and a Flexible WFH Policy
Mental & Physical Wellness programs
Phone/Internet Reimbursement program
Access to Continued Career Development
Comprehensive Benefits
Competitive Packages
Employee Resource Groups
Cloudera is an Equal Opportunity / Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.
Management Level:
9 Individual Contributor