Our client is a worldwide leader in Artificial Intelligence. They have a requirement for a Spark ETL Developer with experience working in AWS (EMR), Spark, Python, and Airflow to work in their San Francisco or Palo Alto, CA locations. This role is Contract to Hire and is open to Green Card holders or Citizens who are local to the Bay Area, CA.
- This engineer will take part in developing and testing various ETL applications. Building these applications requires teamwork, and to deliver these solutions, the engineer will collaborate with an interdisciplinary team of experts in machine learning, data visualization & design, business process optimization, and software engineering. Candidates for this role should have extensive knowledge of and experience with Spark, Airflow, Python, Jinja templating, AWS EMR, AWS S3, and the AWS CLI. Ideally, this person can also tune ETL applications under various conditions using Spark.
- Write custom ETL applications using Spark in Python/Java that follow a standard architecture.
- Success will be defined by the ability to meet requirements/acceptance criteria, on-time delivery, a low number of defects, and clear documentation.
- Perform functional testing, end-to-end testing, performance testing, and UAT of these applications and code written by other members of the team.
- Proper documentation of the test cases used during QA will be important for success.
- Linux – common working knowledge, including navigating through the file system and simple bash scripting
- Hadoop – common working knowledge, including the basic ideas behind HDFS and MapReduce, and the hadoop fs commands.
- Spark – how to work with RDDs and DataFrames (with emphasis on DataFrames) to query and manipulate data.
- Python/Java – Python would be ideal, but solid knowledge of Java is also acceptable.
- Source Control Management Tool – we use Bitbucket.
- Has worked/developed in a Linux or Unix environment.
- Has worked in AWS (particularly EMR).
- Has real hands-on experience developing applications or scripts for a Hadoop environment (Cloudera, Hortonworks, MapR, Apache Hadoop). By that, we mean someone who has written significant code for at least one of these Hadoop distributions.
- Has experience with an ANSI SQL relational database (Oracle, SQL Server, Postgres, MySQL).
- Green Card holders or Citizens local to the Bay Area, CA only
- Local candidates only; an in-person interview is required
- Term: Contract to hire
Since 2002, Maxonic has been at the forefront of connecting candidate strengths to client challenges. Our award-winning, dedicated team of recruiting professionals is specialized by technology; we are great listeners and will seek to find a position that meets the long-term career needs of our candidates. We take pride in the over 5,000 candidates we have placed, and in the repeat business we earn from our satisfied clients.
Interested in Applying?
We’d love to hear from you! Please click apply with your most current resume and anything else you’d like us to know about you. You are also welcome to email Nina Schindler (firstname.lastname@example.org) or call 408-739-4900 x 123.