Jobs at Makena Partners


Data Engineer

Bend, OR

DESCRIPTION

Are you passionate about building data pipelines that leverage cutting-edge big data technologies? Our client's platform gathers data from millions of devices around the world, and we need your help to wrangle it! As a Data Engineer, you will architect and own the pipeline and tools that allow us to extract valuable insights from our data and deliver those insights back to our customers through stunning, highly performant visualizations. You'll work closely with software engineers, cloud engineers, and data scientists to optimize our data infrastructure for scalability and security and to enable machine learning and BI use cases. Be prepared to work in a fast-paced, highly collaborative agile environment and help us deliver the next generation of features for the platform.

RESPONSIBILITIES

  • Build and maintain a scalable, secure, and cost-effective big data pipeline architecture
  • Build analytics tools for data mining, machine learning, and BI use cases
  • Leverage the Azure cloud for ingesting, storing, and processing big data
  • Work with software engineers to optimize data ingestion and schema for processing
  • Work with cloud engineers to build and automate scalable infrastructure and keep our data secure
  • Work with data scientists and business analysts to extract actionable insights from our data
  • Actively seek out and implement ways to improve our underlying processes and technologies

QUALIFICATIONS

  • BS in Computer Science, Engineering, or a related field
  • 5+ years of industry experience building and optimizing big data pipelines and datasets
  • Advanced SQL knowledge and experience working with relational databases, including PostgreSQL
  • Strong working experience with Python, Java, and Scala
  • Experience with a variety of big data tools including Spark and Kafka
  • Experience with Azure cloud services, including Event Hubs, Data Lake, Databricks, HDInsight, and Power BI
  • Experience with stream-processing systems, including Spark Streaming
  • Experience using data lakes and building ETL pipelines to create clean, structured data models
  • Strong analytical skills for working with unstructured datasets
  • An ability to work under pressure while maintaining a sense of humor