Senior Big Data Engineer
ESO is a rapidly growing technology company passionate about improving community health and safety through the power of data. We provide software applications, interoperability and data management solutions to emergency medical services, fire departments and hospitals.
We’re small enough to be nimble and fun, but big enough to be a great, stable place to work. We serve more than 14,000 customers out of our offices in Austin, Texas, and Des Moines, Iowa.
About the role
ESO is rapidly expanding its analytics and data science capabilities to align with emerging big data technologies. We are looking for a Senior Engineer with prior experience leveraging JVM languages and tools in the Apache ecosystem to accelerate the development of our cloud-based analytics products.
We are an Agile development shop and regularly demo our work to project stakeholders. To support this methodology, we practice continuous integration, embrace open source software, and empower our developers to make informed technology and product decisions.
More about you
You have demonstrated expert-level ability in contemporary programming languages, with Java and/or Scala preferred as a base language. As a member of ESO’s Analytics team, you should bring a hunger and curiosity for learning new technologies and programming paradigms, generating momentum and excitement around ESO’s research and data product capabilities. In this position, you can expect to develop streaming event-processing pipelines, complex analytical models, and data APIs built atop highly available, scalable platform architectures.
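To give a flavor of the functional, event-stream style of work described above, here is a minimal plain-Scala sketch. It is purely illustrative: the `Event` type, its field names, and the per-incident aggregation are assumptions for this example, not ESO’s actual code or data model.

```scala
// Hypothetical event record; fields are illustrative only.
case class Event(incidentId: String, kind: String, value: Double)

object EventPipeline {
  // Fold a batch of events into per-incident totals using
  // plain functional collection operations.
  def totalsByIncident(events: Seq[Event]): Map[String, Double] =
    events
      .groupBy(_.incidentId)
      .map { case (id, es) => id -> es.map(_.value).sum }

  def main(args: Array[String]): Unit = {
    val events = Seq(
      Event("I-1", "dispatch", 1.0),
      Event("I-1", "arrival", 2.5),
      Event("I-2", "dispatch", 1.0)
    )
    println(totalsByIncident(events))
  }
}
```

In production this kind of transformation would typically run over a distributed stream (for example in Spark) rather than an in-memory `Seq`, but the functional shape of the logic is the same.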
Note that the technical criteria below are not intended as an exclusive, filtering set of requirements. However, we are looking for someone with this background so they can reasonably follow and participate in discussions of the topics that come before the Analytics team.
Some of the things required to be successful in the role:
- Strong communication skills, both written and spoken
- Technical experience with:
  - languages: Java, SQL, Scala, Python, and XML
  - configuration and deployment: Git, Maven, Ansible, Octopus, and Jenkins
  - system communication: HTTP/S, JMS, messaging, and JSON
  - patterns: domain-driven design, event stream processing, and functional programming
  - data management: DBMS (OLTP), data warehouses, data lakes, structured and unstructured data
  - platforms and frameworks: cloud, Hadoop, and Spark
- It’s a plus if you have experience with:
  - Linux virtual machines and containers
  - systems like JIRA, Confluence, and Salesforce