Data Engineer

at MiC3 International

Johannesburg, Gauteng, South Africa

Start Date: Immediate
Expiry Date: 25 Dec, 2024
Salary: Not Specified
Posted On: 27 Sep, 2024
Experience: N/A
Skills: Good communication skills
Telecommute: No
Sponsor Visa: No

Description:

DATA INTEGRATION ENGINEER (NIFI)

As a Data Integration Engineer, you will be responsible for designing, implementing, and maintaining data integration solutions that handle real-time streaming data from a variety of sources, including IoT/IIoT protocols, third-party APIs, and raw files. Your main objective will be to process data in real time and provide valuable insights for our organization. You will work with a diverse range of Big Data tools and technologies. The successful candidate will have experience in embedded systems bring-up, requirements engineering management, systems integration, developing programs to drive HW and SW planning, and articulating the big picture.
Additionally, you will be involved in the development of a data streaming platform built on NiFi.
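
As a rough illustration of the kind of real-time pipeline described here, the sketch below bridges an IoT feed into a message broker: it subscribes to an MQTT topic and republishes each payload to Kafka. The hostnames, topic names, and the choice of the paho-mqtt and kafka-python client libraries are illustrative assumptions, not details from this posting.

```python
# Minimal MQTT -> Kafka bridge sketch; hosts and topics are assumptions.
# Requires: pip install paho-mqtt kafka-python
import paho.mqtt.client as mqtt
from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers="localhost:9092")  # assumed Kafka broker

def on_message(client, userdata, msg):
    # Forward each MQTT payload to Kafka, keyed by the source MQTT topic.
    producer.send("iot.telemetry", key=msg.topic.encode(), value=msg.payload)

client = mqtt.Client()             # paho-mqtt 1.x style constructor; 2.x also takes a CallbackAPIVersion
client.on_message = on_message
client.connect("localhost", 1883)  # assumed MQTT broker address
client.subscribe("sensors/#")      # assumed topic filter for edge devices
client.loop_forever()              # block and dispatch incoming messages
```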

Responsibilities:

  • Data Integration Design: Collaborate with cross-functional teams to understand data requirements, source systems, and data formats.
  • Design efficient data integration pipelines for real-time data streaming from multiple sources.
  • Programming Languages: Develop custom data processing components and applications using Java and Python to meet specific business requirements.
  • ETL Development: Implement Extract, Transform, Load (ETL) processes to ingest and transform data from various streaming sources into a format suitable for analysis and storage.
  • Real-time Data Processing: Develop and optimize data processing workflows to ensure timely handling of streaming data, maintaining low-latency and high-throughput capabilities.
  • Big Data Tools: Utilize and maintain Big Data tools such as Apache NiFi, Spark, and Kafka to build scalable and robust data integration solutions.
  • Message Broker Configuration: Set up and configure message brokers such as RabbitMQ and Kafka, along with protocols such as AMQP, to enable efficient data exchange between different systems and applications.
  • IoT/IIoT Protocols Integration: Integrate and work with IoT/IIoT protocols such as MQTT, SNMP, CoAP, TCP, and WebSockets to capture data from edge devices and industrial systems.
  • Data Quality and Validation: Implement data validation checks and data quality measures to ensure the accuracy and reliability of the integrated data (a minimal validation sketch follows this list).
  • Performance Monitoring: Monitor the performance and health of data integration pipelines, making necessary adjustments to optimize data flow and resource utilization (a NiFi monitoring sketch appears after the Technical Requirements list below).
  • Troubleshooting and Issue Resolution: Diagnose and resolve issues related to data integration, ensuring smooth and uninterrupted data streaming.
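
To make the ETL and data-quality responsibilities above concrete, here is a minimal per-record validation and transform sketch of the kind such a pipeline might apply before loading; the field names, rules, and timestamp convention are hypothetical.

```python
# Minimal per-record validation/transform sketch (field names and rules are hypothetical).
import json
from datetime import datetime, timezone

REQUIRED_FIELDS = {"device_id", "timestamp", "value"}

def validate_and_transform(raw: bytes) -> dict | None:
    """Parse a raw JSON record, reject it if malformed, and normalise the timestamp."""
    try:
        record = json.loads(raw)
    except json.JSONDecodeError:
        return None                       # quarantine unparseable input
    if not REQUIRED_FIELDS <= record.keys():
        return None                       # reject records missing required fields
    # Normalise epoch seconds to ISO-8601 UTC for downstream storage.
    record["timestamp"] = datetime.fromtimestamp(
        record["timestamp"], tz=timezone.utc
    ).isoformat()
    return record
```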

Technical Requirements:

  • Strong experience in designing and implementing data integration solutions for real-time streaming data.
  • Proficiency in using Big Data technologies such as Apache NiFi, Apache Spark, and Kafka.
  • Familiarity with message brokers such as RabbitMQ and Kafka, and with messaging protocols such as AMQP, for data exchange and event-driven architectures.
  • Hands-on experience with IoT/IIoT protocols such as MQTT, SNMP, CoAP, TCP, and WebSockets.
  • Proficiency in programming languages such as Java and Python for developing custom data processing components.
  • Knowledge of data quality assurance and validation techniques to ensure reliable data.
  • Ability to troubleshoot and resolve issues related to data integration and streaming processes.
  • Strong analytical and problem-solving skills, with a keen eye for detail.
  • Excellent communication and teamwork skills to collaborate effectively with cross-functional teams.
  • Experience with cloud-based platforms and distributed systems is advantageous.
  • A never-ending curiosity to learn and work with new tools and technologies.
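
On the monitoring side, NiFi exposes flow statistics over its REST API; a small poller like the sketch below could watch overall queue depth as a basic health signal. The /nifi-api/flow/status endpoint is NiFi's standard status resource, but the base URL, the unsecured setup, and the alert threshold are assumptions.

```python
# Poll NiFi's flow status endpoint as a basic pipeline health check.
# The base URL and alert threshold are assumptions for illustration.
import requests

NIFI = "http://localhost:8080/nifi-api"   # assumed unsecured NiFi instance

def check_flow_health(max_queued: int = 10_000) -> None:
    status = requests.get(f"{NIFI}/flow/status", timeout=10).json()["controllerStatus"]
    queued = int(status["flowFilesQueued"])
    print(f"active threads: {status['activeThreadCount']}, flowfiles queued: {queued}")
    if queued > max_queued:
        print("WARNING: queue depth exceeds threshold; investigate backpressure")

if __name__ == "__main__":
    check_flow_health()
```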

REQUIREMENT SUMMARY

Experience: Min: N/A, Max: 5.0 year(s)
Industry: Information Technology/IT
Category: IT Software - DBA / Datawarehousing
Specialization: Software Engineering
Education: Graduate
Proficiency Level: Proficient
Openings: 1
Location: Johannesburg, Gauteng, South Africa