- Demonstrable experience of Big Data technologies, including any of Google Cloud Platform, Google Dataflow, Google BigQuery, the Hadoop ecosystem and Spark, at the enterprise level;
- Commercial development experience using programming languages such as Java, Python or Scala;
- An understanding of and experience with Agile software development;
- Knowledge of how to architect and use Enterprise Bus/Data Integration solutions;
- Experience of SQL and NoSQL databases;
- Experience of automated testing and continuous integration (Jenkins, GoCD);
- Strong knowledge of distributed computing.
Any of the following is also great:
- Strong knowledge and experience of data-warehousing strategies and how they transfer to large data problems;
- Experience with tools such as Mathematica, R, Python, Java, Scala, Matlab, cloud environments etc.;
- Demonstrable experience in creating reporting layer data objects for analytical reporting tools such as Tableau, Qlikview, SAS, Power BI;
- Hands on experience in developing and deploying machine learning models (deployment pipelines / model retraining / ways of exposing model / monitoring);
- Experience implementing data driven algorithmic systems in collaboration with data scientists;
- Exposure to real time big data processing (Spark Streaming, Kafka, Amazon Kinesis);
- Exposure to environments where businesses have become data driven;
- Ability to explore new technologies and strong innovation skills.
In this role you will be:
- Prototyping solutions using the latest Big Data technologies;
- Transitioning from prototype to deployed product;
- Aiding development teams to deliver data onto Ocado Technology’s new cloud-based data platform using the latest data integration technologies;
- Championing processes to ensure best data practices are followed;
- Collaborating with our Data Scientists on innovative, paradigm-changing models;
- Developing low-latency data capture and analysis solutions to enable near real-time business insight;
- Solving analytics problems and communicating results and methodologies.