Electronics and Telecommunications Research Institute (ETRI)

  • Technology Trends in Deep Learning Inference Engines for Embedded Systems

    Publication date: 2019.08.06
    Deep learning is a hot topic in both academia and industry. Deep learning applications can be categorized into two areas. The first category involves applications such as Google AlphaGo, which interface with human operators while running complex inference engines on high-performance servers. The second category includes embedded applications for mobile and Internet-of-Things (IoT) devices, automobiles, and similar platforms. Applications in the second category must operate under the hardware (H/W) and software (S/W) restrictions imposed by their deployment environment. For example, image recognition in an autonomous vehicle requires low latency, while image recognition on a mobile device requires low power consumption. In this paper, we describe the issues faced by embedded applications, review popular inference engines, and introduce a project under development to satisfy these H/W and S/W requirements.
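    To make the contrast with server-side inference concrete, the sketch below shows what running an inference engine on an embedded device typically looks like, using the TensorFlow Lite Python runtime as a representative lightweight engine. The engine choice, model file name, and dummy input are illustrative assumptions and are not taken from this report.

    ```python
    # Minimal on-device inference sketch with TensorFlow Lite (illustrative only).
    # The model file "mobilenet_v1_quant.tflite" is a hypothetical pre-converted model.
    import numpy as np
    import tflite_runtime.interpreter as tflite  # lightweight runtime for embedded targets

    # Load the converted model and allocate its tensors once at startup.
    interpreter = tflite.Interpreter(model_path="mobilenet_v1_quant.tflite")
    interpreter.allocate_tensors()

    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()

    # Dummy input standing in for a preprocessed camera frame.
    input_shape = input_details[0]["shape"]
    dummy_input = np.zeros(input_shape, dtype=input_details[0]["dtype"])

    # Run one inference pass and read back the class scores.
    interpreter.set_tensor(input_details[0]["index"], dummy_input)
    interpreter.invoke()
    scores = interpreter.get_tensor(output_details[0]["index"])
    print("Top class index:", int(np.argmax(scores)))
    ```

    In this setting, the latency of `interpreter.invoke()` and the memory and power cost of the allocated tensors are exactly the kinds of H/W and S/W constraints the abstract refers to.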