Big Data Engineer


Salary: $106,390.00 - $159,250.00 /year *

Employment Type: Full-Time


Job Category: Information Technology


Overall Purpose of Role

We are in need of a Big Data Engineer/Developer to join our expanding Surveillance development team in the world-class Whippany office. This person will be involved in a holistic surveillance build-out, which requires excellent knowledge of and hands-on experience with big data (Hadoop - Cloudera stack) technologies such as Hive, Spark, Kafka, and Angular JS to architect, develop, and support a sophisticated analytics pipeline using machine learning algorithms. The ideal candidate must have excellent analytical skills and Compliance business knowledge. Further, he or she must be extremely organized and should have good oral and written communication skills. Experience working with a global team that includes data scientists, business analysts, and developers is highly beneficial.

Key Accountabilities

- Architect, design, and develop analytical models using big data technologies such as Hadoop, Spark/Scala, Hive, Python, and Kafka.
- Build the graph ingestion process and support graph user-interface development using DataStax, Gremlin, D3, and Angular.
- Design robust code from the point of view of performance, reuse, supportability, and proper controls, consistent with best practices and with appropriate documentation.
- Support migration of the Hadoop cluster, including migration of feeds, ETL processes (Ab Initio), scripts, Autosys, and machine learning models/code.
- Implement organization-mandated SDLC Agile processes and built-in controls for consistent delivery.
- Demonstrate a proactive and independent attitude when identifying or approaching a problem; identify the root cause, come up with an efficient solution, and escalate issues/risks as needed.
- Train new team members (both onshore and offshore), including code reviews and mentoring to bring them up to speed.
- Quickly understand the current environment and work closely with offshore/onsite development teams, infrastructure teams, support, and source-system contacts.
- Discuss requirements and testing details with the Tech Lead, Data Scientists, QA, and business stakeholders.
- Demonstrate the ability to self-learn new technologies and stay actively plugged into development and technology advancements in our domain/industry.
- Possess good business domain knowledge in Compliance, Financial Crime, and Financial Services.
- Handle multiple tasks simultaneously with frequently changing priorities.

Stakeholder Management and Leadership

- Convey ideas to senior managers, leads, and business stakeholders.
- Prepare presentations and communicate effectively with technical managers and leads so they can easily understand the architecture and the solution being proposed.

Decision-Making and Problem Solving

- Demonstrate a proactive and independent attitude when identifying or approaching a problem.
- Identify the root cause, come up with an efficient, cost-effective solution, and escalate issues/risks as needed.

Risk and Control Objective

Ensure that all activities and duties are carried out in full compliance with regulatory requirements, the Enterprise Wide Risk Management Framework, and internal Barclays Policies and Policy Standards.

Person Specification

Essential Skills/Basic Qualifications

- BS degree in Computer Science, Information Technology, or a related discipline
- Over 5 years of hands-on experience as a big data developer, with a proven record of multiple successful deliveries in Hadoop and Spark/Scala
- 2 years' experience in Python, Angular, or graph technologies
- 3 years' hands-on experience in UNIX, Autosys, and Java

Desirable Skills/Preferred Qualifications

- Experience with a variety of database technologies, mainly Hive, Impala, Oracle, or SQL Server
- 1 year of experience with Kafka
- Hands-on experience with any business intelligence tool (preferably Tableau)
- Good understanding of analytics and machine learning algorithms
- Able to influence architecture/design and communicate effectively with senior management and stakeholders
- Hands-on experience with the Ab Initio ETL tool and the Tableau business intelligence tool
* The salary listed in the header is an estimate based on salary data for similar jobs in the same area. Salary or compensation data found in the job description is accurate.

