Start Date
Immediate
Expiry Date
08 Aug, 25
Salary
$60.00 per hour
Posted On
09 May, 25
Experience
0 year(s) or above
Remote Job
Yes
Telecommute
Yes
Sponsor Visa
No
Skills
Good communication skills
Industry
Information Technology/IT
We need a strong NiFi developer for a data engineer role, with solid experience in NiFi and MongoDB: writing Mongo queries, setting up NiFi flows, and Kafka streaming (a minimal sketch of this kind of query follows the role details below). Please share suitable profiles with VEEF scores.
Role: NiFi Developer
Location: Texas or St. Louis (onsite); local candidates only
Experience: 10+ years
Visa: any except OPT and CPT
Passport number: mandatory
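As a rough illustration of the "writing Mongo queries" part of this role, here is a minimal pymongo sketch. The connection string, database, collection, and field names are placeholders for illustration, not part of the posting.

```python
# Minimal sketch of a MongoDB query, assuming the pymongo client library.
# All names below (analytics/orders, status, retries, ...) are hypothetical.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed local instance
orders = client["analytics"]["orders"]             # placeholder db/collection

# Find recent failed records, project only the fields we need,
# and return the 50 most recently updated.
cursor = orders.find(
    {"status": "FAILED", "retries": {"$lt": 3}},
    {"_id": 0, "orderId": 1, "status": 1, "updatedAt": 1},
).sort("updatedAt", -1).limit(50)

for doc in cursor:
    print(doc)
```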
Detailed JD:
Responsibilities:
Data Flow Design and Implementation:
Design and implement NiFi data pipelines for various business needs, including data ingestion, transformation, and loading.
NiFi Configuration and Management:
Configure and manage NiFi clusters, including setting up processors, flow controllers, and other components (a scripted example follows this list).
Data Pipeline Optimization:
Optimize data flows for performance, scalability, and efficiency, addressing bottlenecks and improving processing times.
Troubleshooting and Debugging:
Identify and resolve issues within NiFi data pipelines, including data quality issues, performance problems, and security vulnerabilities.
Data Integration:
Integrate NiFi with other data platforms and systems, including databases, message queues, and cloud services; MongoDB and SQL experience is a must.
Data Quality and Security:
Ensure data quality and security throughout the data pipeline, implementing appropriate data validation, cleansing, and security measures.
Collaboration and Communication:
Collaborate with other data engineers, developers, and business stakeholders to understand requirements, design solutions, and ensure project success.
Documentation:
Document NiFi data pipelines, workflows, and procedures for future maintenance and troubleshooting.
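To make the cluster-management and troubleshooting duties above concrete, here is a hedged sketch that scripts against NiFi's REST API from Python. The host, port, and unsecured setup are assumptions about a local dev instance; a secured cluster would also require a bearer token or TLS client certificates.

```python
# Hedged sketch: list processors in NiFi's root process group via the
# REST API and flag any that are not running. Assumes an unsecured
# local NiFi instance; endpoint and layout are assumptions.
import requests

NIFI = "http://localhost:8080/nifi-api"  # assumed local NiFi endpoint

# Fetch the root process group, then enumerate its processors.
root = requests.get(f"{NIFI}/process-groups/root", timeout=10).json()
listing = requests.get(
    f"{NIFI}/process-groups/{root['id']}/processors", timeout=10
).json()

for proc in listing["processors"]:
    comp = proc["component"]
    if comp["state"] != "RUNNING":
        print(f"{comp['name']} ({comp['id']}) is {comp['state']}")
```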
Skills:
NiFi Expertise: Strong understanding of NiFi architecture, processors, flow controllers, and expressions.
Data Engineering: Experience with data ingestion, ETL, and data warehousing.
Programming: Proficiency in SQL, MongoDB queries, Splunk queries, Python, and Java (a streaming sketch follows this list).
Databases: Knowledge of relational (SQL) databases and MongoDB.
Big Data Technologies: Familiarity with big data technologies like Kafka, Hadoop, and Spark.
Cloud Platforms: Experience with cloud platforms like AWS, Azure, or Google Cloud Platform.
Data Modeling: Ability to design and implement data models for analytical and reporting purposes.
Security: Understanding of data security principles and best practices.
Communication and Collaboration: Excellent communication and collaboration skills.
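As one illustration of the Kafka streaming plus MongoDB skills listed above, here is a minimal consumer sketch, assuming the kafka-python and pymongo client libraries. The topic, broker address, and collection names are hypothetical; a production flow would add batching, error handling, and schema validation (or run through NiFi itself).

```python
# Minimal Kafka-to-MongoDB streaming sketch, assuming kafka-python and
# pymongo. Topic, broker, and collection names below are placeholders.
import json

from kafka import KafkaConsumer
from pymongo import MongoClient

consumer = KafkaConsumer(
    "events",                              # hypothetical topic
    bootstrap_servers=["localhost:9092"],  # assumed broker address
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)
events = MongoClient("mongodb://localhost:27017")["analytics"]["events"]

# Land each consumed event as a document in MongoDB.
for message in consumer:
    events.insert_one(message.value)
```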
Thanks
uday@synnoveglobal.net
Job Type: Contract
Pay: $55.00 - $60.00 per hour
Work Location: On the road
Please refer to the job description for details.