1. Overall experience of 5.5 years, with a minimum of 4 years of relevant experience in Big Data
technologies.
2. Hands-on experience with the Hadoop stack: HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark,
Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required for
building an end-to-end data pipeline. Working knowledge of real-time data pipelines is an
added advantage.
3. Strong experience in at least one of the programming languages Java or Scala; Java
preferred.
4. Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB,
Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
5. Well-versed in, and with working knowledge of, data platform services on GCP.
6. Bachelor’s degree and 6 to 8 years of work experience, or any combination of
education, training, and/or experience that demonstrates the ability to perform the
duties of the position.