Design and develop Hadoop applications
Hands-on experience developing jobs in PySpark with Python/Scala (preferred) or Java
Experience in Core Java
Experience with MapReduce programs, Hive programming, and Hive query performance concepts
Experience...
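For illustration of the kind of work described above, the following is a minimal PySpark job sketch, not part of the posting: it reads a Hive-managed table and runs a simple aggregation. The table and column names (sales, region, amount) are assumptions made purely for the example.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

def main():
    # enableHiveSupport() lets Spark SQL read Hive-managed tables
    spark = (SparkSession.builder
             .appName("hive-aggregation-example")
             .enableHiveSupport()
             .getOrCreate())

    # Read a hypothetical Hive table and total the amount per region
    sales = spark.table("sales")
    totals = (sales.groupBy("region")
                   .agg(F.sum("amount").alias("total_amount")))

    totals.show()
    spark.stop()

if __name__ == "__main__":
    main()
```

The same job could be written in Scala or Java against the Spark DataFrame API; PySpark is shown here only because the posting lists it first.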
Job Location: Bangalore, Karnataka, India