Working student / intern (all genders) Data Engineer
at AEB SE
Stuttgart, Baden-Württemberg, Germany
Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa |
---|---|---|---|---|---|---|---|
Immediate | 31 Jan, 2025 | Not Specified | 01 Nov, 2024 | N/A | Google Cloud, Optimization, Computer Science, Java, Data Products, Data Structures, Models, Pipeline Development, R, Airflow, Azure, AWS, Root Cause Analysis, Programming Languages, Reliability, Database Systems, Storage Solutions, Python, Spark | No | No |
Description:
Who we are
At AEB, we use intuitive software to enable efficient, safe, ecological, and equitable supply chains. More than 7,000 companies in over 80 countries rely on our IT solutions for foreign trade, customs, export control, and logistics.
With us, you get the freedom and open structures to really make a difference – and a working environment that ensures that you can achieve your best.
What you’ll be working on
More than just observing, you’ll be actively contributing to meaningful projects. Under the guidance of experienced colleagues, you’ll gain deep insights into the customs domain and help build products that truly serve our customers’ needs. You will work within a product operating model and collaborate with cross-functional team members such as product managers, business analysts, and data scientists. Your future tasks will include:
- Assist in the design, development, and optimization of scalable and efficient data pipelines that handle large volumes of data
- Learn and apply best practices in data pipeline development
- Help in building ETL/ELT processes to move and transform data from various sources
- Participate in integrating data from multiple internal and external sources
- Assist in optimizing data storage solutions and ensure efficient handling of large datasets
- Conduct data validation checks to ensure accuracy and reliability
- Assist in documenting processes, models and data products
- Assist in troubleshooting data issues and performing root cause analysis
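To give a concrete feel for the ETL/ELT and validation tasks above, here is a minimal, illustrative sketch in Python. All names and data are hypothetical; this is not AEB's actual stack, just the extract → transform → validate → load pattern in miniature:

```python
# Minimal ETL sketch (hypothetical data, illustrative only): extract raw
# records, transform them, run a validation check, then load into a target.

def extract():
    """Extract: in practice this would read from a database, API, or file."""
    return [
        {"shipment_id": "S1", "country": "de", "value_eur": "1200.50"},
        {"shipment_id": "S2", "country": "fr", "value_eur": "980.00"},
    ]

def transform(records):
    """Transform: normalize country codes and cast amounts to numbers."""
    return [
        {
            "shipment_id": r["shipment_id"],
            "country": r["country"].upper(),
            "value_eur": float(r["value_eur"]),
        }
        for r in records
    ]

def validate(records):
    """Validation check: every record must carry a positive value."""
    bad = [r for r in records if r["value_eur"] <= 0]
    if bad:
        raise ValueError(f"invalid records: {bad}")
    return records

def load(records, target):
    """Load: append to the target store (here, a plain list stands in)."""
    target.extend(records)
    return target

warehouse = []
load(validate(transform(extract())), warehouse)
```

In a real pipeline each stage would typically be an orchestrated task (e.g. in a scheduler such as Airflow) rather than a plain function call, but the responsibilities map one-to-one onto these steps.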
Qualifications
- Currently enrolled in a Bachelor’s degree or higher in Computer Science, Engineering, or related fields
- Good understanding of data structures and relational database systems, and experience writing advanced SQL queries
- Understanding of NoSQL / object-oriented databases and knowledge of good system design practices is a plus
- Familiarity with programming languages such as Python, Java or R
- Exposure to data processing frameworks and tools like Spark or Airflow
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud
- Understanding of ETL processes and data pipeline tools
- Strong problem-solving skills and attention to detail
Perhaps you also have other suitable skills. We look forward to getting to know you and your wealth of experience.
What you can rely on
We treat all our employees equally, including future ones. This means that we do not tolerate discrimination, for example on the basis of age, disability, gender, sexual orientation, ethnic origin, or religion. We also create a working environment that meets the diverse needs of all people. The only thing that counts for us is what you can do and what you want to achieve with us.
REQUIREMENT SUMMARY
Experience: Min N/A, Max 5.0 year(s)
Industry: Information Technology/IT
Category: IT Software - DBA / Datawarehousing
Role: Software Engineering
Education: Graduate
Field of study: Computer science, engineering, or related fields
Proficiency: Proficient
Openings: 1
Location: Stuttgart, Germany