We are on a mission to empower enterprise organizations to get more value out of their data, and we do it responsibly, with a high regard for data security! We are solving some truly amazing challenges, and you would be a great fit if you embody a growth mindset, put the customer first, are a go-getter who values the team over the individual, are accountable, hard-working, and transparent, and are looking to join a fast-paced startup environment that truly wants to make a difference.
Be sure to check out our most exciting recent press below, especially our being named one of the 10 coolest big data start-ups of 2020!
Role and Responsibilities:
- Architect, design and develop new features based on customer and stakeholder requirements.
- Take abstract problems and iteratively distill and build scalable, sustainable, performant solutions.
- Communicate regularly with the engineering leadership and product managers to ensure the project is on track and checkpoint goals are met.
- Demonstrate a deep understanding of what it takes to build a database engine. When designing solutions, consider key concepts like multi-threading, parallel processing, memory management, and file management, to name a few.
- Understand and utilize AWS SDK libraries to leverage AWS capabilities such as RDS, S3, and CloudTrail.
- Create clear and concise documentation of project and investigative work, including design documents, architectural diagrams, and other technical documents as needed.
- Give high priority to quality of work by creating unit tests and integration tests, and include them in the CI process when appropriate.
- Ensure code runs in Docker with minimal to no changes needed between development and production environments.
- Participate in design discussions for other projects within the engineering team, and guide junior developers on technical aspects.
- Take on challenges faced within the engineering team, such as release-process problems, build issues, and other pressing customer-raised problems that require quick turnaround with the highest quality of work.
- Be a regular member of the support rotation.
Qualifications:
- Bachelor's or Master's degree in computer science or a related discipline.
- Hands-on experience working with large data platforms, data lakes, and data warehouses.
- 5+ years of experience in at least one object-oriented programming language such as Java, C++, Python, or Go.
- You have experience with SQL, NoSQL, AWS and/or distributed computing systems.
- Understanding of database concepts and optimization techniques.
- Debugging experience with highly concurrent systems; you have solved scaling problems and optimized memory/CPU utilization.
- Experience with Docker and containerization is a plus.
- Experience with CI/CD and other build systems.
Technologies we use and learn together:
- Big data technologies like Impala, Apache Spark, Hadoop, and Hive
- Amazon Web Services
- Microsoft Azure
- Docker / Kubernetes
- EMR / Spark / Hive / Databricks
- Avro / Parquet / JSON / CSV data file formats