You are ready, if you ...
- want to join us on our mission to digitize the logistics world, using your expertise, skills and passion for data to help customers make smarter, faster data-driven decisions that reduce CO2 emissions and today’s major market inefficiencies
- are a quick learner and strong problem solver with attention to detail, able to take on loosely defined problems and to break down and simplify complex technical concepts
- have a track record of designing and implementing reliable, scalable, secure and cost-efficient (data) architectures in the public cloud, maintained with Infrastructure as Code to version and deploy changes
- have gathered industry experience with AWS, most notably its core services (S3, IAM, CloudWatch, CloudTrail, VPC, EC2), the serverless stack and the CLI/SDKs; knowledge of distributed computing and stream processing, as well as certifications, is ‘nice to have’
- enjoy automation and are familiar with Git, the concepts behind DevOps, and building and maintaining CI/CD pipelines
- have written software in Python, Java or another high-level programming language, and have good knowledge of Linux, shell scripting and containers (e.g. Docker / Kubernetes)