Possible locations: Europe, preferably Germany or South Africa
AWS Professional Services is a unique organization. Our customers are among the most advanced companies in the world. We build world-class, cloud-native IT solutions for them to solve real business problems, and we help them achieve business outcomes with AWS. Our projects are often unique, one-of-a-kind endeavors that no one has ever done before.
As a Big Data & Analytics Consultant, you will architect, (re)design and build cloud-native, business-critical Big Data solutions with our customers. You will take advantage of the global scale, elasticity, automation and high-availability features of the AWS platform. You will build customer solutions with Amazon Elastic Compute Cloud (EC2), AWS Data Pipeline, AWS Glue, Amazon S3, Amazon DynamoDB, Amazon Relational Database Service (RDS), Amazon EMR, Amazon Kinesis, Amazon Redshift and other AWS services. You will work across a range of areas including web and mobile applications, enterprise applications, HPC, IoT, Big Data and Artificial Intelligence / Machine Learning, and engage with the technical, IT and leadership teams of our customers.
You will collaborate across the whole AWS organization, with other consultants, customer teams and partners on proof-of-concepts, workshops and complex implementation projects. You will innovate and experiment to help Customers achieve their business outcomes and deliver production-ready solutions at global scale. You will lead projects independently but also work as a member of a larger team. Your role will be key to earning customer trust.
As an Amazonian you will demonstrate the Amazon Leadership Principles, coaching and mentoring others on best practices, performance and career development.
This is a customer-facing role. When appropriate and safe, you will be required to visit our office and to travel to client locations to deliver professional services.
· invent and build Big Data and Analytics solutions that solve complex problems, scale globally, guarantee performance, and enable breakthrough innovations,
· work with systems engineers, consultants and data scientists to design and build Data Analytics platforms and Data Lakes to support compute heavy data science, dashboarding, and web-facing production tooling,
· build ETL pipelines to consolidate and relate petabytes of data owned by disparate teams,
· work with customers and partners, guiding them through planning, prioritization and delivery of complex transformation initiatives, while collaborating with relevant AWS Sales and Service Teams,
· (re)design solutions to use cloud technologies and modern software development practices,
· help customers define their business outcomes and guide their technical architecture and investments,
· create and apply frameworks, methods, best practices and artifacts that will guide our Customers; publish and present them in large forums and across various media platforms,
· coach and mentor fellow developers on how to develop high-quality code and innovate using the latest technologies, AWS services and development best practices,
· contribute to enhancing and improving AWS services.
· Bachelor’s degree, or equivalent experience, in Computer Science, Engineering, Mathematics or a related field.
· 6+ years of experience with Data Lake/Hadoop platform implementation, including 3+ years of hands-on experience implementing and performance-tuning Hadoop/Spark deployments.
· 3+ years of data management expertise, spanning ETL processes, master data management or data management platforms experience, and integration in complex environments.
· Understanding of Apache Hadoop and the Hadoop ecosystem. Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, Zookeeper, HCatalog, Solr, Avro).
· Experience with massively parallel processing (MPP) models, real-time processing and analytics, data ingestion (batch and streaming) and data storage solutions.
· Understanding of database and analytical technologies in the industry including MPP and NoSQL databases, Data Warehouse design, ETL, BI reporting and Dashboard development.
· Familiarity with one or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto).
· Hands-on experience delivering large-scale data warehousing and analytics projects.
· Implementation experience with OLAP databases such as Amazon Redshift, Teradata, Netezza or Vertica.
· Knowledge of BI / data visualization tools (e.g. Tableau, Spotfire, MicroStrategy, Cognos).
· Experience working directly with customers, partners or third-party vendors.
· Ability to think strategically about business, product, and technical challenges in an enterprise environment.
· Experience identifying the relationships between business services, information, applications and global infrastructure assets.
· Exposure to Agile development methodologies.
· Excellent communication and presentation skills.
· Willingness and interest to collaborate and work in a team.
· Strong sense of customer focus, ownership, urgency, and drive.
· Strong communication and data presentation skills; familiarity with data visualization tools.
· Master’s degree or PhD in Computer Science, Physics, Engineering or Math.
· Implementation and tuning experience specifically with Amazon EMR.
· Knowledge of core AWS services (EC2, ELB, RDS, Route 53, S3, Redshift, Kinesis, Glue).
· Infrastructure automation through DevOps scripting (e.g. shell, Python, Ruby, PowerShell).
· Track record of implementing AWS services in a variety of distributed computing, enterprise environments.
· Experience using machine learning libraries such as scikit-learn, caret, mlr or MLlib.
· Use of AWS services in distributed environments with Microsoft, IBM, Oracle, HP, etc.
We invest heavily in our team by continuously offering learning opportunities, sharing knowledge internally across all technical teams in AWS, and working on (customer & internal) projects that will broaden and deepen your technical expertise and business acumen.
If you have an entrepreneurial spirit, are eager to deliver results, are deeply technical and highly innovative, and are a voracious learner, you are exactly who we are looking for.