AMIT SINGH
Phone: (M) +91 9910607138
Pune, Maharashtra - 411014
Email: [email protected]

~ HADOOP ETL DW Technology Lead ~

Seeking senior-level assignments that would facilitate the maximum utilization and application of my broad skills and expertise in making a positive difference to the organization

SUMMARY

A value-driven and highly accomplished Cloudera Certified Hadoop Developer with 8+ years of software development experience across domains such as Banking & Financial Services, Life Insurance, Telecommunications, Platform Development and Online Portals, presently working with Barclays, Pune as Hadoop Technology Lead.

Experience across all phases of the project lifecycle: Requirement Analysis, Functional Specifications, Design Specifications, Coding, Testing, Implementation and Support

Design, development and live deployment for an EDW on CDH 5.4 using Hive, Impala, Scalding, Sqoop, Cloudera Manager, Cloudera Navigator, HUE and Teradata

Reviewing design/architecture, building re-usable components, live deployment, collecting performance metrics from the live Hadoop estate, new-technology fitment, live data proving, referential integrity checks and data reconciliation
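
For illustration, a minimal sketch of the kind of reconciliation check described above, written against Spark's Scala API; the table and column names are hypothetical, not the production Barclays code:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object ReconcileSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("reconciliation-sketch")
          .enableHiveSupport()
          .getOrCreate()

        // Landed Teradata extract and its EDW counterpart (hypothetical tables).
        val source = spark.table("staging.loans_extract")
        val target = spark.table("edw.loans")

        // Row-count reconciliation.
        val srcRows = source.count()
        val tgtRows = target.count()

        // Column-level checksum on a measure, to catch silent truncation or corruption.
        val srcSum = source.agg(sum("principal_amount")).first().get(0)
        val tgtSum = target.agg(sum("principal_amount")).first().get(0)

        println(s"row counts: source=$srcRows target=$tgtRows match=${srcRows == tgtRows}")
        println(s"sum(principal_amount): source=$srcSum target=$tgtSum match=${srcSum == tgtSum}")

        spark.stop()
      }
    }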

Experience designing an Enterprise Data Warehouse on Hadoop, performing Sparse History and SCD (Slowly Changing Dimensions) operations, CDC and data modelling
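
A minimal SCD Type 2 sketch using Spark's Scala API, assuming a dimension keyed on customer_id with one tracked attribute (address) and start_date/end_date/is_current housekeeping columns; names are hypothetical and brand-new keys are omitted for brevity:

    import org.apache.spark.sql.DataFrame
    import org.apache.spark.sql.functions._

    object Scd2Sketch {
      // dim: existing dimension rows; snapshot: today's view of the source.
      def applyScd2(dim: DataFrame, snapshot: DataFrame, loadDate: String): DataFrame = {
        val open   = dim.filter(col("is_current"))
        val closed = dim.filter(!col("is_current"))

        // Open rows whose tracked attribute differs from today's snapshot.
        val changed = open.alias("o")
          .join(snapshot.alias("s"), col("o.customer_id") === col("s.customer_id"))
          .where(col("o.address") =!= col("s.address"))

        val changedKeys = changed.select(col("o.customer_id").as("customer_id")).distinct()

        // Expire the superseded open versions.
        val expired = open.join(changedKeys, Seq("customer_id"), "left_semi")
          .withColumn("is_current", lit(false))
          .withColumn("end_date", lit(loadDate))

        // Open rows with no change are carried over untouched.
        val unchangedOpen = open.join(changedKeys, Seq("customer_id"), "left_anti")

        // New current versions for the changed keys.
        val newVersions = changed.select(
          col("s.customer_id").as("customer_id"),
          col("s.address").as("address"),
          lit(loadDate).as("start_date"),
          lit(null).cast("string").as("end_date"),
          lit(true).as("is_current"))

        closed.unionByName(expired).unionByName(unchangedOpen).unionByName(newVersions)
      }
    }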

Good experience in Hadoop 2.4, MapReduce, YARN, HDFS and ecosystem tools, namely Pig, Hive, Sqoop 2 and Flume

Hands-on experience in Lambda Architecture and streaming/messaging frameworks like Kafka, Spark, Confluent and Camus

Well versed with NoSQL databases like MongoDB and Cassandra

Skilled at the Tableau BI tool

Good knowledge of enterprise search engines like Apache Solr, Elasticsearch and Kibana

Extended expertise in Core Java, JSP, Servlets, SOA architecture, Spring DI, Spring Core, Spring MVC, Liferay and middleware technologies

Well versed with Agile methodologies

Design and development experience in the EAI middleware suite Sun Java CAPS 5.1.3 and 6.2

Client-site experience with Aegon (United States) and Safaricom (Kenya)

Eminent leadership expertise with an exemplary record of driving teams to successful project execution

Effective analyst, problem solver and communicator, able to forge solid relationships with upper-level executive leaders and build consensus across multiple organizational levels

Strong business acumen, excellent interpersonal skills, and strong leadership and team-building capabilities

TECHNICAL FORTE

Languages: Java, Scala, Shell Programming

Big Data Technologies: CDH 5.4, HDFS, Pig, Hive, Impala, Sqoop 2, YARN, MapReduce, Spark, Kafka, Camus, Lambda Architecture, Tivoli Workload Scheduler

NoSQL Databases: MongoDB, Cassandra

BI Tool: Tableau

Frameworks: Spring, Liferay, Struts 1.2, Hibernate

Web Technologies: XML, HTML, JavaScript, Servlets, JSP

Portal: Liferay Portal

Application Servers: JBoss, WebSphere, WebLogic, Tomcat

Continuous Integration: Jenkins, Hudson, Nexus

Tools: Eclipse, Apache Ant, Maven, JIRA, Crucible

Configuration Management: Git, SVN, PVCS, VSS

EXECUTIVE ACCOMPLISHMENTS

As Hadoop Technology Lead

Successfully designed, developed and live-deployed Enterprise Data Warehouse use cases on CDH 5.4 (Hadoop) using open-source technologies, namely Hive, Scalding, Impala, Sqoop, Scala, Java and Spark

Actively involved in the design and coding of re-usable components using open-source Big Data technologies like Scalding and Spark

As Technical Architect - Big Data/Hadoop

Successfully delivered PropIndex automation using Hadoop, MapReduce, Sqoop 2, Spring Batch, MongoDB, Tableau Public and DB2

Actively involved in data lake creation using the open-source technologies available in Hadoop

As Senior Software Engineer

Understood the Liferay CMS portal and configured it as per project needs

Received the Performance Excellence Award for the timely delivery of Keystone platform work

As Software Engineer

Recipient of the ACE Award for exceptional work on the PreTUPS product

CAREER CONTOUR

Barclays (BTCI), Pune (Sep’15 – Present)
Hadoop Technology Lead - AVP

Project#1: 9 Product Stack – Building Enterprise Data Warehouse on Hadoop & Offloading BIW Teradata systems

Client: Barclays
Duration: Sep’15 – Present
Environment: Cloudera Hadoop 5.4.4

Technologies: Hadoop, MapReduce, Hive, Impala, Scalding, Scala, Java, Sqoop, Teradata, Hue, Cloudera Manager, Spark

Responsibilities: Design, development, building re-usable components, live deployment, collecting performance metrics from live Hadoop, new-technology fitment, live data proving, referential integrity checks and data reconciliation

Description: The objective of this project is to build an enterprise data warehouse (EDW) that integrates data across nine product stacks (Transactions, Loans, Savings, Mortgages, Investment, Cash Banking, Debt Finance, Trade & Working Capital and Forex) and provides an enterprise reporting solution, a single version of truth, across business units by leveraging the Hadoop platform with open-source technologies like Hive, Scalding, Impala, Scala and Java.

MAGICBRICKS, Noida (Nov’14 – Sep’15)
Technical Architect – Big Data/Hadoop

Project#1: PropIndex Automation using Hadoop, MapReduce, Sqoop 2, Spring Batch, MongoDB, Tableau Public, DB2

Client: MagicBricks
Duration: Nov’14 – Sep’15
Description: As part of www.magicbricks.com, a Property Index report is generated from data collected from user activity on the website. This report was generated quarterly and required manual effort: a DBA would extract the data from DB2 tables into Excel and hand it to the content and research team, which applied Excel formulas and produced a static PDF. The automated pipeline used Hadoop for batch processing, MapReduce for the computations, MongoDB for storing pre-computed results, Sqoop 2 for exporting data from DB2 to HDFS, and Tableau Public for generating graphs with MongoDB as the data source.
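
The project itself used MapReduce for the computation; purely as an illustration, the same kind of quarterly aggregate can be sketched with Spark's Scala API (paths and column names are hypothetical):

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object PropIndexSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("propindex-sketch").getOrCreate()

        // Listings exported from DB2 to HDFS (e.g. via Sqoop) as CSV.
        val listings = spark.read
          .option("header", "true")
          .option("inferSchema", "true")
          .csv("/data/magicbricks/listings")
          .withColumn("listing_date", to_date(col("listing_date")))

        // Quarterly average asking price per city/locality: the kind of
        // pre-computed aggregate that fed the PropIndex report.
        val index = listings
          .withColumn("quarter",
            concat(year(col("listing_date")).cast("string"), lit("Q"),
                   quarter(col("listing_date")).cast("string")))
          .groupBy("city", "locality", "quarter")
          .agg(avg("price_per_sqft").as("avg_price_per_sqft"),
               count(lit(1)).as("num_listings"))

        // The project stored such results in MongoDB for Tableau; Parquet stands in here.
        index.write.mode("overwrite").parquet("/data/magicbricks/propindex")
        spark.stop()
      }
    }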

Project#2: Data lake creation with all data sources using Kafka, Confluent, Camus, Hadoop
Client: MagicBricks
Duration: May’15 – Sep’15
Description: Within MagicBricks there are plenty of data sources: every user activity is tracked and dumped into DB2, along with tracking and alerts data, BOT identification data, transactional data, etc. The data is maintained in various repositories like DB2 and MongoDB, and old data is purged once it crosses threshold limits. Created an architecture using Kafka (distributed messaging), Camus (ETL from Kafka topics to Hadoop) and HDFS, and successfully transported one such data source, the BOT traffic data, directly to the data lake.
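
A minimal sketch of the producer side of such a pipeline in Scala, publishing BOT-traffic events to a Kafka topic that Camus then lands on HDFS; the broker address, topic name and payload shape are hypothetical:

    import java.util.Properties
    import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

    object BotTrafficProducerSketch {
      def main(args: Array[String]): Unit = {
        val props = new Properties()
        props.put("bootstrap.servers", "kafka01:9092")
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")

        val producer = new KafkaProducer[String, String](props)
        try {
          // One event per detected BOT hit; keyed by source IP for partitioning.
          val event = """{"ip":"203.0.113.7","url":"/property/123","ua":"crawler","ts":1438387200}"""
          producer.send(new ProducerRecord[String, String]("bot-traffic", "203.0.113.7", event))
        } finally {
          producer.close()
        }
      }
    }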

PITNEY BOWES SOFTWARE, Noida (Oct’11 – Aug’14)
Senior Software Engineer

Project#1: Keystone Platform - Security
Client: Different Products
Duration: Feb’13 – Aug’13
Role: Understand the Liferay autologin framework to leverage and extend single sign-on capabilities, and the Liferay service architecture to expose some API capabilities as web services
Tech Stack: Java, Liferay Autologin Framework, Liferay Service Builder, Spring, JBoss 7, MySQL, Maven
Description: The Keystone platform provides different modules and integration points for different products. This project deals with the security aspect of the platform and utilizes the Liferay autologin framework and Service Builder module for SSO & LDAP to provide a generic security component for different products.

Responsibilities: Analyzed requirements to define the functional specifications. Developed the security module using Liferay’s Service Builder & autologin framework. Defined the deployment process on various platforms, including app servers and databases.
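
For illustration only, a hook of this kind can be sketched in Scala against the Liferay 6.x AutoLogin interface; the trusted header and the returned dummy credentials are assumptions, not the actual Keystone code:

    import javax.servlet.http.{HttpServletRequest, HttpServletResponse}
    import com.liferay.portal.security.auth.AutoLogin

    // Trusts a user id asserted by an upstream SSO gateway via a request header.
    class HeaderAutoLogin extends AutoLogin {
      override def login(request: HttpServletRequest,
                         response: HttpServletResponse): Array[String] = {
        val userId = request.getHeader("X-SSO-USER-ID") // hypothetical header name
        if (userId == null) {
          null // no assertion present: defer to the next hook in auto.login.hooks
        } else {
          // Liferay expects {userId, password, "is the password encrypted?"}.
          Array(userId, "not-used", java.lang.Boolean.TRUE.toString)
        }
      }
    }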

Project#2: Keystone Platform
Client: Internal Products
Duration: Apr’12 – Jan’13
Role: Execute POCs and suggest approaches related to various aspects of the platform, like the portal (Liferay), jBPM designer integration with the portal, and the identity manager (OpenAM)
Tech Stack: Apache Hadoop, Cassandra, Core Java, Liferay, Alloy UI, OSGi, Maven, JBoss
Description: A lot of the products in the company can be orchestrated together to unlock new business opportunities, but these products are built on different technologies, so a platform needed to be developed to provide integration capabilities for different products, addressing common aspects like logging, security, workflow capabilities, portal & data services.

Responsibilities: Executed POCs related to workflow, portal and data. Suggested approaches based on the findings of the POCs. Active involvement with the architectural group in the technology selection process.

Project#3: eMessaging - Enhancements & Performance Improvement
Client: Santander, Citibank
Duration: Oct’11 – Aug’14
Role: Development of the integration module & performance improvement
Tech Stack: Apache Hadoop, Cassandra, Core Java, Spring Core, MySQL, Tomcat
Description: e-Messaging helps organizations maintain consistency and enhance personalization across multi-channel communication, including automated email and SMS texts. With this easily managed e-Messaging, companies can vastly improve call-center response to email and/or text communications.

Responsibilities: Evaluated the product on Hadoop and NoSQL technology for performance bottlenecks and generated analytics to satisfy customer-generated data-processing needs. Certified the product against new platforms. Developed a new module using the existing base framework.

COMPUTER SCIENCES CORPORATION, Noida (Jul’10 – Oct’11)
Application Developer

Project#1: Integrated Competency Centre
Client: AEGON
Duration: Jul’10 – Oct’11
Role: Development, Implementation
Tech Stack: Java, JCAPS 5.1.3, 6.2, XML, WSDL, SOAP, MQ, JMS
Description: ICC comprises the Java CAPS based implementations on the 5.1.3 and 6.2 versions, enabling AEGON’s enterprise applications to talk to these interfaces and perform certain validations depending on the business requirement, e.g. Agent Validation, Policy Validation.

Responsibilities: Developed and tested new interfaces created as per the business needs. Integrated communication to various mainframe systems using Java Message Service to post messages on various types of queues and topics.

MAHINDRA COMVIVA, Gurgaon (Jul’08 – Jun’10)
Engineer

Project#1: PreTUPS
Client: Safaricom, Tigo
Duration: Jul’08 – Jun’10
Role: Development and Support
Tech Stack: Core Java, Struts 1.2, PL/SQL, Oracle 10g, Tomcat, Eclipse, VSS
Description: PreTUPS is a web-based application used for recharging and billing of prepaid subscribers. It enables the service provider to define the domains and hierarchy of the entire retailer chain. The e-recharge is integrated with Intelligent Networks (IN), USSD gateways and POS (external gateways) in order to receive and process recharge requests. It is a mobile-to-mobile top-up system. PreTUPS also provides the operator’s prepaid retailers with the ability to accept subscribers’ post-paid bill payments, which increases the operator’s collection points for its post-paid subscribers, as they no longer have to seek out drop boxes or pay by cheque to settle their post-paid bills.

Responsibilities: Designed HLDs and LLDs based on change requests or new feature requests. Responsible for development, unit testing and implementation, along with support. Environment setup, including application server setup and VSS-related activities. Database setup for the application. Defect analysis and bug fixing. Migration of the live environment to a new code base using downtime.

CERTIFICATIONS & TRAININGS

Cloudera Certified Hadoop Developer, May 2014

Attended a 2-day workshop for the Certified Scrum Master role in November 2013

Attended a Hadoop Administration workshop on deploying a cluster on AWS in October 2014

Completed the Scrum Master training programme at Pitney Bowes Software

Undertook an Agile methodologies programme at Pitney Bowes Software

Completed the LOMA insurance learning programme at CSC

EDUCATIONAL CREDENTIALS

B.Tech. (Computer Engineering) from Jamia Millia Islamia, New Delhi, in 2008 with a CGPA of 7.13

PERSONAL VITAE

Date of Birth: 11th June, 1985
Languages Known: English & Hindi
Preferred Location: United Kingdom / Singapore / United States
Passport Details: G7184525, Feb/2018
Visa Status: U.S.A B1 Visa – Valid till November 2020