Priorities?

9/19/21, by artyom

DLPrimitives already shows promising results, but I'm really wondering what to prioritize next:

  1. Add more useful operators (dropout, upscaling, LSTM, PReLU, MSE loss, etc.) to make DLPrimitives fully featured.
  2. Try to improve existing OpenCL frameworks like Caffe-OpenCL (or PlaidML) by using DLPrimitives core operations.
  3. Start working on a PyTorch OpenCL backend - a huge undertaking.
  4. Work on float16/bfloat16 support.
  5. Continue improving performance by integrating with open-source implementations for Arm Mali and Intel.

Every task is important.

It is logical to add more operators so that DLPrimitives, as a DL framework, can be useful for real-world tasks - and it can be done relatively quickly, since most of the operators aren't that complex.
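To show why many of these operators are straightforward, here is a minimal reference sketch of MSE-loss semantics (forward and backward) in NumPy. The function names are illustrative only, not DLPrimitives API; an OpenCL kernel implementing the operator would just have to match these semantics element-wise.

```python
import numpy as np

def mse_loss_forward(pred, target):
    # mean of squared differences over all elements
    diff = pred - target
    return float(np.mean(diff * diff))

def mse_loss_backward(pred, target):
    # gradient of mean-squared error w.r.t. pred:
    # d/dpred mean((pred - target)^2) = 2 * (pred - target) / N
    return 2.0 * (pred - target) / pred.size

pred = np.array([1.0, 2.0, 3.0])
target = np.array([1.0, 2.0, 5.0])
print(mse_loss_forward(pred, target))  # 4/3 ≈ 1.3333
```

Both passes are a single element-wise map plus a reduction, which is why such operators are quick to add once the kernel infrastructure exists.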

But in order to make it really useful (and not niche), it needs to be integrated with at least one of the popular frameworks like PyTorch, TF, or MXNet. On the other hand, implementing a PyTorch backend is a huge task that will take a lot of time - but it is actually the true goal.

I could go with improving Caffe-OpenCL, where I mostly need to fix several performance-critical layers by using dlprimitives... ah, and fix Caffe's memory management, since Keras/PyTorch use roughly 1/4 of the memory Caffe does. It could be a good POC, but Caffe is effectively dead - and I already have a working POC.
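The memory gap largely comes from Caffe keeping a dedicated output buffer per layer, while modern executors recycle a buffer once no later layer reads it. A toy liveness-based sketch (illustrative only, not Caffe's or PyTorch's actual allocator) shows the effect:

```python
def peak_buffers(last_use, reuse=True):
    """Peak number of live output buffers when tensor i is produced
    at step i and last read at step last_use[i]."""
    if not reuse:
        return len(last_use)            # Caffe-style: one buffer per tensor
    free, freed, assign, pool = [], set(), {}, 0
    for i in range(len(last_use)):
        for t in range(i):              # recycle buffers of dead tensors
            if t not in freed and last_use[t] < i:
                free.append(assign[t])
                freed.add(t)
        if free:
            assign[i] = free.pop()
        else:
            assign[i] = pool            # no dead buffer available: allocate
            pool += 1
    return pool

# a linear 4-layer chain: each output is read only by the next layer
chain = [1, 2, 3, 3]
print(peak_buffers(chain, reuse=False))  # 4
print(peak_buffers(chain, reuse=True))   # 2
```

For a linear chain, reuse needs only two live buffers (current input and output) regardless of depth, which is roughly where the 4x gap comes from on deeper networks.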

Hard to decide.
