
Senior Hadoop Engineer, Guadalajara – Mexico
Guadalajara, Mexico

Date posted: 01.08.2022

Job type: Full time

Remote work

Job perks: Cool tech stack; Multicultural team; Professionalism

About the project

At Pentalog, excellence is what you do. We're guided by a mission to positively impact the software development world.

One of the largest user-generated content platforms dedicated to travel is in constant need of talented engineers.

Now, our client is looking to staff a pod of highly skilled, senior-level Hadoop Engineers. The ideal candidate has a passion for big data operations, a deep understanding of both software development and the operating environment, and experience with multiple technologies in the Hadoop ecosystem.

Job requirements

  • Bachelor’s degree in Computer Science or a related field;
  • 5+ years’ experience in commercial software development;
  • Experience with Hadoop, Hive, or MPP database platforms, including running these environments in support of multiple internal teams;
  • Ability to break down complex problems into simple solutions;
  • In-depth, hands-on knowledge of Linux, shell scripting, and related open-source technologies such as Apache Hadoop, Kafka, Samza, and YARN;
  • Strong understanding of distributed systems and distributed computation;
  • Deployment automation experience with scripting, Puppet, Ansible, etc.;
  • Excellent verbal and written communication skills;
  • Advanced English language skills.
     

Responsibilities

  • Provide hands-on operational support for a high-SLA Hadoop cluster running Cloudera and participate in a pager rotation;
  • Suggest and participate in preventative maintenance activities that keep the cluster healthy and operational per our SLA requirements, such as tuning in-house tools that identify unused or un-queried data and that manage the small-files problem on Hadoop;
  • Deploy features that make the cluster easier to service and operate, such as using Ansible to automate taking nodes in and out of service to prevent memory leaks (see the sketch after this list);
  • In collaboration with in-house data engineering leads, help migrate on-premises Hadoop workloads to our new cloud data platform.
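For flavor, here is a minimal Bash sketch of the drain/restart/recommission cycle such an Ansible playbook might automate. The exclude-file path, service names, and report parsing below are assumptions that vary by cluster; on Cloudera, this is often driven through Cloudera Manager instead:

    #!/usr/bin/env bash
    # Illustrative sketch only: cycle one worker node out of and back into
    # service. The exclude-file path and service names are assumptions.
    set -euo pipefail

    NODE="$1"                              # hostname of the node to cycle
    EXCLUDES=/etc/hadoop/conf/dfs.exclude  # assumed dfs.hosts.exclude path

    # 1. Mark the node for decommissioning and have HDFS/YARN re-read the
    #    list (YARN often uses its own exclude path; shared here for brevity).
    echo "$NODE" >> "$EXCLUDES"
    hdfs dfsadmin -refreshNodes
    yarn rmadmin -refreshNodes

    # 2. Wait until the dfsadmin report shows the node fully decommissioned
    #    (report parsing here is approximate).
    until hdfs dfsadmin -report | grep -A3 "$NODE" | \
          grep -q "Decommission Status : Decommissioned"; do
      sleep 60
    done

    # 3. Restart the worker daemons to reclaim leaked memory.
    ssh "$NODE" 'sudo systemctl restart hadoop-hdfs-datanode hadoop-yarn-nodemanager'

    # 4. Recommission: drop the node from the exclude list and refresh again.
    sed -i "/^${NODE}\$/d" "$EXCLUDES"
    hdfs dfsadmin -refreshNodes
    yarn rmadmin -refreshNodes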
     

Extra skills

  • Vast experience with Apache Hadoop ecosystem applications: Hadoop, Hive, Oozie, Presto, Hue, Spark, Zeppelin, etc.;
  • Expert-level knowledge of and experience with SQL, relational database engines, or MPP databases;
  • Strong coding experience in scripting languages (Bash, Python);
  • Coding experience with at least one statically typed OO language such as Java, C++, or C# is preferred;
  • Strong Linux experience: proficiency with tools such as lsof, iostat, and top, and the ability to identify resource bottlenecks (a short triage sketch follows this list).
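For illustration, a minimal Bash sketch of a first-pass bottleneck triage using those tools; the DataNode main class passed to pgrep is standard, but the workflow itself is an assumption, not a prescribed procedure:

    #!/usr/bin/env bash
    # Hypothetical first-pass triage of a busy Hadoop worker node.

    # CPU and memory pressure: one batch-mode iteration of top.
    top -b -n 1 | head -n 20

    # Disk saturation: extended iostat (sysstat); sustained high %util on a
    # data disk often explains slow HDFS reads and writes.
    iostat -x 1 3

    # Open file descriptors held by the DataNode; a steadily growing count
    # can indicate a leak, or the small-files problem surfacing as too many
    # open block files.
    DN_PID=$(pgrep -f org.apache.hadoop.hdfs.server.datanode.DataNode | head -n 1)
    [ -n "$DN_PID" ] && lsof -p "$DN_PID" | wc -l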
     

Benefits

  • Highly competitive salary;
  • The option of full-remote work;
  • Free pass to learning platforms;
  • Attractive benefits above the legal minimum;
  • Enrollment into our active community (Meetups, PentaBar tech events, Lunch & Learn, PentaSport events);
  • The possibility of applying to international positions within Pentalog.
     

About Pentalog

As a leading European software services company operating internationally in France, Romania, Germany, Poland, Moldova, the UK, Vietnam, Mexico, and the USA, we employ over 1,600 engineers and IT experts who work in a dynamic, multicultural environment.

At Pentalog, your talents and ambitions are recognized and rewarded; we offer plenty of opportunities to develop, both personally and professionally, and we reward collaborators who understand the importance of self-improvement.
