• New Training Session Sign-In Process

    We will be beta testing a barcode-based sign-in process (replacing the physical sign-in sheets) at our monthly training sessions. The barcode used for sign-in appears at the bottom of the registration confirmation email; you can sign in with your cell phone or with a paper printout of that email. This will allow us to track attendance more accurately and to upload session CPEs directly to ISACA International.

MetLife

Software Development Engineering Manager - Big Data (108859)

Posted On: 05-Mon-2019
Closing On: 06-Thu-2019

Job Description:


Role Value Proposition: 

We are looking for a Big Data Engineer who will ingest, store, process, and analyze huge sets of data, transforming that data using Big Data tools. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them. You will also be responsible for integrating them with the architecture used across the company.

 Key Responsibilities: 

  • Ingesting huge volumes of data from various platforms for analytics needs.
  • Building and implementing ETL processes using Big Data tools such as Spark (Scala/Python), Hive, and NiFi (a minimal sketch follows this list).
  • Monitoring performance and advising on any necessary infrastructure changes.
  • Performing code reviews and ensuring code is production-ready and follows best practices and standards.
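The ETL responsibility above typically amounts to a read-transform-write job on the Hadoop platform. As a rough illustration only, here is a minimal Spark (Scala) sketch of such a step; the table names ("raw_events", "curated_events") and column names are hypothetical placeholders, not taken from the posting.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object EventEtlJob {
      def main(args: Array[String]): Unit = {
        // Hive support so the job can read and write managed tables.
        val spark = SparkSession.builder()
          .appName("EventEtlJob")
          .enableHiveSupport()
          .getOrCreate()

        // Ingest: raw records previously landed in Hive (e.g. via NiFi, Sqoop, or Kafka).
        val raw = spark.table("raw_events")            // hypothetical source table

        // Transform: basic cleansing, deduplication, and derivation of a partition column.
        val curated = raw
          .filter(col("event_ts").isNotNull)
          .dropDuplicates("event_id")
          .withColumn("event_date", to_date(col("event_ts")))

        // Load: write a partitioned table for downstream analytics consumers.
        curated.write
          .mode("overwrite")
          .partitionBy("event_date")
          .saveAsTable("curated_events")               // hypothetical target table

        spark.stop()
      }
    }

In practice a job like this would also be scheduled, performance-tuned (partitioning, caching, shuffle settings), and code-reviewed before promotion to production, in line with the responsibilities listed above.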

 Supervisory Responsibilities:  Leads and motivates project team members who are not direct reports, and provides work direction to lower-level staff members, ensuring they follow the established best practices and standards.

Preferred Skills:

 Essential Business Experience and Technical Skills:

Required:

  • 10+ years of solutions development experience
  • Extensive experience with Spark and Scala, Python, and performance tuning is a must
  • Proficiency and extensive experience with HDFS, Sqoop, Hive, Pig, Flume, Kafka, etc.
  • Performance tuning and problem-solving skills are a must
  • Proficiency and extensive experience in developing SQL and performance tuning
  • Degree in Computer Science or a related field

Preferred:

  • Experience with Informatica PC/BDM 10 and implementing pushdown processing into the Hadoop platform is a huge plus.
  • Experience building ETL pipelines using NiFi.
  • Experience handling unstructured data formats when ingesting into the platform and exposing data for consumption.
  • Experience with NoSQL databases, such as HBase.
  • Data warehousing concepts and experience migrating legacy warehouses to a Big Data platform.
  • Experience with cloud platforms such as Azure is a huge plus.
  • Defining data security principals and policies using Ranger and Kerberos.

https://career8.successfactors.com/sfcareer/jobreqcareer?jobId=108859&company=mlprod&username=
