Start Date: Immediate
Expiry Date: 22 May, 25
Salary: 0.0
Posted On: 23 Jan, 25
Experience: 0 year(s) or above
Remote Job: Yes
Telecommute: Yes
Sponsor Visa: No
Skills: Good communication skills
Industry: Information Technology/IT
DESCRIPTION
Many companies, such as Google, Facebook, and Amazon, are building new specialized programming frameworks. These companies need to let their users write simple, high-level code and run it efficiently on different hardware architectures. For example, Google has built TensorFlow, a deep learning framework that allows users to run deep learning workloads on multiple hardware architectures without changing their code.
Our research team at NYUAD (New York University Abu Dhabi) in collaboration with MIT (Massachusetts Institute of Technology, USA) is developing a new programming framework called Tiramisu [1]. Unlike existing frameworks, Tiramisu can perform advanced code optimizations that are hard to apply otherwise. Because of this, Tiramisu can generate fast code that outperforms highly optimized code written by expert programmers and can target different hardware architectures (multicore, GPUs, FPGAs, and distributed machines).
To get the best performance (fastest execution) from a given Tiramisu program, many code optimizations should be applied. These include vectorization (using hardware vector instructions), parallelization (running loop iterations in parallel), enhancing data locality through fusion, and blocking (i.e., accessing arrays in a way that improves temporal and spatial data locality). A large number of optimizations exist, and choosing which to apply is critical for performance: an optimization that is beneficial in one situation can be harmful in another. Currently, there is no automated way to help users make this choice; expert Google programmers usually spend a lot of time trying different optimizations manually to find the best set.
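To make the blocking (tiling) optimization concrete, here is a minimal sketch in plain Python. This is only an illustration of the loop structure; Tiramisu applies such transformations to generated loop nests in compiled code, where the cache benefit actually materializes.

```python
def matmul_naive(a, b):
    """Straightforward triple loop: streams through whole rows/columns,
    giving poor temporal locality on large matrices."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matmul_blocked(a, b, B=4):
    """Blocked (tiled) multiply: processes B x B tiles so the working
    set stays small, improving temporal and spatial data locality."""
    n = len(a)
    c = [[0] * n for _ in range(n)]
    for ii in range(0, n, B):
        for kk in range(0, n, B):
            for jj in range(0, n, B):
                for i in range(ii, min(ii + B, n)):
                    for k in range(kk, min(kk + B, n)):
                        aik = a[i][k]
                        for j in range(jj, min(jj + B, n)):
                            c[i][j] += aik * b[k][j]
    return c

n = 8
a = [[i + j for j in range(n)] for i in range(n)]
b = [[i - j for j in range(n)] for i in range(n)]
assert matmul_naive(a, b) == matmul_blocked(a, b)
```

Both versions compute the same result; only the iteration order changes, which is exactly why such transformations can be applied automatically by a compiler.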
The goal of this project is to add support for automatic code optimization to Tiramisu. In particular, we want to use machine learning/deep learning to achieve this. A basic automatic optimization module that relies on machine learning has already been developed, and we want to take that module to the next level. The final product of this project would be a compiler pass that allows Tiramisu to automatically choose which optimizations to apply to a given unoptimized program. We want to produce a high-quality technique that can be used by the users of Tiramisu, and especially by our partner companies and research labs.
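The shape of such a compiler pass can be sketched as follows: enumerate candidate optimization configurations, score each with a cost model, and keep the predicted best. Everything here is hypothetical for illustration; the optimization names, feature names, and the hand-written scoring function stand in for Tiramisu's real scheduling commands and a trained ML model, none of which are shown in this posting.

```python
from itertools import product

# Hypothetical candidate optimizations (illustrative, not Tiramisu's API).
CANDIDATES = {
    "parallelize": [False, True],
    "vectorize": [False, True],
    "tile_size": [0, 16, 32],  # 0 = no tiling
}

def predict_speedup(features, config):
    """Stand-in for a learned cost model: scores a (program, schedule)
    pair. A real pass would query a trained ML/DL model here."""
    score = 1.0
    if config["parallelize"]:
        score *= min(features["cores"], features["parallel_loops"] + 1)
    if config["vectorize"] and features["innermost_stride_1"]:
        score *= 2.0
    if config["tile_size"] and features["reuses_data"]:
        score *= 1.5
    return score

def best_schedule(features):
    """Enumerate the (small) configuration space and keep the schedule
    the model predicts to be fastest."""
    keys = sorted(CANDIDATES)
    best, best_score = None, float("-inf")
    for values in product(*(CANDIDATES[k] for k in keys)):
        config = dict(zip(keys, values))
        s = predict_speedup(features, config)
        if s > best_score:
            best, best_score = config, s
    return best

# Example: a program with parallel loops, unit-stride inner accesses,
# and data reuse should get all three optimizations enabled.
features = {"cores": 8, "parallel_loops": 2,
            "innermost_stride_1": True, "reuses_data": True}
schedule = best_schedule(features)
```

In practice the search space is far too large to enumerate exhaustively, which is precisely where the learned model earns its keep: it prunes or ranks candidates instead of measuring every one.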
Please refer to the job description for details.