Looking for a DevOps Engineer with AWS experience
Responsibilities
- Administer our high-availability production system based on Ubuntu, Docker, HAProxy, Java Play! and MySQL
- Secure the server and network infrastructure by applying security hardening, and automate recurring tasks using Ansible and Jenkins
- Support the development team by maintaining the development and test environments running on AWS
- Enhance our monitoring system based on Zabbix, Grafana and Graylog, and operate our data warehouse (Apache Spark and Tableau)
Requirements
Primary Skill Set
- 5 years of experience in a Linux environment combined with a passion for its administration
- Experience with tools such as Ansible, Puppet, Jenkins, Graylog, ELK or Grafana
- Backend: Java, Jetty, REST, SOA, WebRTC, MySQL, Ubuntu
- Monitoring / Logging: Zabbix, Logentries
- Deployment: Git, Jenkins, Docker
- Environment: Continuous Integration across development, testing and production environments
- Good knowledge of network protocols such as TCP, UDP, DNS, TLS and IPsec