The candidate should also have an in-depth understanding of real-time data harvesting, parsing, storage, and modeling, and will be responsible for the architecture and design of the Enterprise Data Hub, Data Lake, and Data Catalogs.
Role and Responsibilities
- Demonstrated experience with the following: Data Governance, Data Architecture, Data Modeling & Design, Metadata, Master Data, Data Quality (including data profiling), Content Management, Data Warehousing & Business Intelligence, Database Management (including Change Data Capture (CDC)), Data Security, Data Integration (including streaming data).
- Lead teams of 2-10+ technical resources.
- Experience with cloud platforms (AWS, Azure), including readiness, provisioning, security, and governance.
- Solid experience designing and architecting large-scale Big Data applications leveraging on-premises and/or cloud platforms, and Big Data architecture patterns such as data lakes, Lambda, and Kappa.
- Active involvement in thought leadership: best-practices development and written white papers and/or POVs on Big Data technology.
- Provide architectural leadership and vision for the MaestrOS Data Platform. Develop and maintain the architectural roadmap for products and services, ensuring alignment with business and enterprise architecture strategies and standards. Able to develop, present, and explain the value and vision of proposed architectures and solutions to a wide range of audiences.
- Maintain broad technical and solution knowledge in software, infrastructure, big data, security, and business architecture areas.
- Engage with Data Platform customers/partners to understand their needs and provide suggestions/guidance on which PaaS techniques are appropriate to solve their problems, with pros and cons clearly articulated.
- Drive internal proof-of-concept initiatives. When needed, quickly design and implement a prototype of a system or component with a proper architecture, then hand over to (and potentially lead) a small group of developers to finish.
- Build relationships with key architects across technology organizations and collaborate on promoting architectural best practices across technology.
- Deep knowledge of and hands-on experience with big data and cloud computing technologies, including Hadoop, Spark, Hive, HBase, etc.
- Hands-on experience (at least 2 years) with Hadoop, Teradata (or other MPP RDBMS), MapReduce, Hive, Sqoop, Splunk, Storm, Spark, Kafka, and HBase.
- Experience with end-to-end solution architecture for data capabilities including:
- Experience with ELT/ETL development, patterns and tooling (Informatica, Talend).
- Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms.
Spencer Ogden is acting as an Employment Agency in relation to this vacancy.
Applying For This Position
Unless otherwise stated, when applying for a job, you should ensure that you're already authorised to work in the country where the role is located.
There is never a need to provide your bank account details when applying for a job.
Skills and experience: big data, architecture, cloud, data lakes, lambda, kappa, data platform, spark, hadoop, hive, hbase, storm, splunk, etl, informatica, data integration, design, applications, aws, azure, roadmap, enterprise architecture
SPENCER OGDEN: THE AWARD-WINNING GLOBAL ENERGY, ENGINEERING AND INFRASTRUCTURE RECRUITMENT SPECIALISTS.
Spencer Ogden stands for the best in specialist, professional, proactive recruitment, working with skilled Energy, Infrastructure and Engineering professionals across the globe.
UNRIVALLED SERVICE IN EVERY SECTOR
Spencer Ogden represents clients and candidates at mid to senior levels who operate globally within:
• Oil & Gas
• Built Environment
Our client base includes major operators, utility providers, manufacturers, developers, research institutes, consultancies and international government bodies.