Big Data Engineer

About Us

Ocelot Consulting was born out of the idea that autonomy and mastery are worthy goals for good developers, and that the classical development organization could be made more efficient and more pleasurable to work in if run in ways that developers value. Ocelot works to innovate and inspire developers to try new things, applying them to client needs to solve today’s biggest problems.

We aim to give our Ocelot family challenging and rewarding work, competitive compensation, and the opportunity to shape their roles into everything they want them to become. Our goal is to integrate every new team member into a collaborative community of experts.

The Role

As a Big Data Engineer, you will develop innovative software using state-of-the-art big data streaming architectures.

Requirements

  • Extensive experience setting up and developing on a distributed platform such as HBase, Hadoop/Spark, AWS EMR, or Cassandra
  • Extensive ETL experience
  • General-purpose programming languages (Java, C, Scala, Python, Erlang, etc.)
  • Database technologies (Postgres, MongoDB, Cassandra, Elasticsearch, Oracle, etc.), as well as SQL and related query languages

Nice to Have

  • Experience with data lakes
  • Experience with cloud big data technology (AWS Data Pipeline, GCP DataFlow, Azure HDInsight)
  • Experience with Business Intelligence platforms
  • Experience with modern JavaScript
  • API Development (proper microservice separation, HTTP verb usage)
  • DevOps – understanding of OS and Container Management
    • Distributed microservice architectures providing elasticity, redundancy, failover, and intelligent routing
    • Docker/Kubernetes/Cloud Foundry experience

Perks

Standard Benefits Program (medical, dental, retirement, PTO, etc.)

Contact

Interested candidates should submit resumes to: [email protected].