Data Engineer – 12-Month Contract – Big Data / Oracle / Java
- Job Ref: 9540
- Dublin
- IT – Data (Engineering, Science, etc.)
Job Title: Senior Data Engineer
12-Month Contract
Location: South Dublin / Hybrid Work Model
Are you ready to be part of a dynamic team at the forefront of innovation in financial security? Join our client, a payments software company, in their Security Innovation program, where they are utilizing cutting-edge tools to build AI and Machine Learning models that make a real impact.
Key Responsibilities:
- Optimize Data Pipeline Architecture: Create and maintain robust data pipeline architecture, ensuring seamless data flow and collection for cross-functional teams.
- Process Improvement and Automation: Identify and implement process enhancements, automating manual tasks and optimizing data delivery for greater efficiency.
- ETL Expertise: Build infrastructure for efficient extraction, transformation, and loading of data from diverse sources using ETL processes and modern cloud technologies (a minimal sketch follows this list).
- Root Cause Analysis: Analyze both internal and external data to address business queries and uncover opportunities for enhancement.
- Take ownership of requirement clarification and propose effective solutions prior to implementation.
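To illustrate the kind of pipeline work involved, here is a minimal PySpark ETL sketch. The storage paths, column names, and derived fields are hypothetical and chosen only for illustration; the actual environment could equally be Databricks or another Spark runtime.

```python
# Minimal PySpark ETL sketch. All paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("payments-etl").getOrCreate()

# Extract: read raw transaction events from a landing zone (path is illustrative).
raw = spark.read.json("s3://landing/payments/transactions/")

# Transform: normalise types, derive a partition column, drop malformed rows.
clean = (
    raw.withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("event_date", F.to_date("event_ts"))
       .dropna(subset=["transaction_id", "amount"])
)

# Load: write partitioned Parquet for downstream analytics and ML feature pipelines.
clean.write.mode("append").partitionBy("event_date").parquet(
    "s3://curated/payments/transactions/"
)
```

Partitioning the curated output by date is a common design choice that keeps downstream reads for model training and reporting cheap.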
Requirements:
- Data Engineering Expertise: Proven track record of building and optimizing data pipelines, architectures, and datasets.
- Big Data Tools: Experience with Hadoop, Spark, Databricks, Kafka, and similar tools.
- Stream Processing: Experience with stream-processing systems such as Storm and Spark Streaming (a streaming sketch follows this list).
- Cloud and Database Skills: Familiarity with cloud services and with relational SQL and NoSQL databases, including Postgres, Oracle, and CosmosDB.
- Advanced SQL Proficiency: Extensive experience with relational databases and query authoring across a range of database systems.
- Programming Languages: Working knowledge of object-oriented and functional scripting languages such as Java and Python.
- Messaging and Storage: Working knowledge of message queuing, stream processing, and highly scalable big data stores.
- Analytical Skills: Strong analytical skills with unstructured datasets.
- Workflow Management: Familiarity with data pipeline and workflow management tools.
- Project Management: Strong project management and organizational skills.
- Cross-functional Collaboration: Experience collaborating with cross-functional teams in a dynamic environment.
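For the stream-processing requirement, here is a minimal Spark Structured Streaming sketch that consumes from Kafka. The broker address, topic name, and event schema are assumptions for illustration only.

```python
# Minimal Spark Structured Streaming sketch reading from Kafka.
# Broker address, topic, and schema are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("payments-stream").getOrCreate()

# Assumed shape of the JSON payload on the topic.
schema = StructType([
    StructField("transaction_id", StringType()),
    StructField("amount", DoubleType()),
])

# Read the Kafka topic as an unbounded stream of key/value records.
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "transactions")
         .load()
)

# Kafka values arrive as bytes; parse the JSON payload into typed columns.
parsed = (
    events.select(F.from_json(F.col("value").cast("string"), schema).alias("tx"))
          .select("tx.*")
)

# Console sink for inspection; a production job would write to a durable
# store and configure a checkpoint location for fault tolerance.
query = parsed.writeStream.outputMode("append").format("console").start()
query.awaitTermination()
```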
Want to know more? Apply today, or email
[email protected] for more information.