My client is looking for a Scala/Spark Developer, and I wanted to know whether this role would be of interest to you.
Job Title: Scala/Spark Developer
Job Location: London, UK (Hybrid)
Duration: Permanent
- Develop and maintain applications using Hadoop and Scala for the work assigned, take responsibility for managing technology in projects, and provide technical guidance and solutions to complete the work.
- The candidate should have strong development experience with Big Data technologies, including Hadoop and Scala programming.
- Additionally, experience in the following: Java and knowledge of design patterns; Scala/Spark; Hadoop (HDFS/MapReduce/YARN); Hive/HBase; SQL (DDL, DML, subqueries, joins, hierarchical queries, analytical functions, views and materialized views); Unix shell scripting; machine learning/algorithms; Apache Kafka.
- Should have experience with development and CI/CD tools such as Jira, SVN and Git, TeamCity and Maven, as well as unit testing and mocking frameworks.
- Should have worked in Agile environments and have good communication and client-interfacing skills.
- Maintain personal effectiveness, embracing challenging deadlines, change and complex problem solving, and approaching tasks with motivation and commitment.
If you are interested in this position, please send me your CV as soon as possible for immediate consideration, or refer someone else if this role is not for you.
Job Skills: Analytical Functions, DDL, DML, Hadoop (HDFS/MapReduce/YARN), Hierarchical Queries, Hive/HBase, Joins, Machine Learning/Algorithms, Apache Kafka, Scala, Spark, Subqueries, Unix Shell Scripting, Views & Materialized Views
Apply Now