Frame Filtering and Skipping for Point Cloud Data Video Transmission

Volume 2, Issue 1, Pages 76-83, 2017

Author’s Name: Carlos Moreno a), Ming Li

Department of Computer Science, California State University, Fresno, 93740, USA

a) Author to whom correspondence should be addressed. E-mail:

Adv. Sci. Technol. Eng. Syst. J. 2(1), 76-83 (2017); DOI: 10.25046/aj020109

Keywords: Filtering, Frame Skipping, Point Clouds



Sensors that collect 3D spatial data from the real world are becoming increasingly important. They are a prime research topic with applications in consumer markets such as medicine, entertainment, and robotics. However, a primary concern with collecting this data is the vast amount of information generated, which must be processed before it can be transmitted. To address this issue, we propose the use of filtering methods and frame skipping. To collect the 3D spatial data, called point clouds, we used the Microsoft Kinect sensor. In addition, we utilized the Point Cloud Library (PCL) to process and filter the data generated by the Kinect. Two computers were used: a client, which collects, filters, and transmits the point clouds; and a server, which receives and visualizes them. The client also checks consecutive frames for similarity, skipping those that reach a similarity threshold. To compare the filtering methods and test the effectiveness of the frame skipping technique, quality of service (QoS) metrics such as frame rate and percentage of filter were introduced. These metrics indicate how well a given combination of filtering method and frame skipping accomplishes the goal of transmitting point clouds from one location to another. We found that the pass-through filter in conjunction with frame skipping provides the best relative QoS. However, the results also show that there is still too much data for a satisfactory QoS. For a real-time system to provide reasonable end-to-end quality, dynamic compression and progressive transmission need to be utilized.
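The client-side pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the paper uses PCL's pass-through filter in C++ and does not specify its similarity measure, so the range-based filter and the distance-based similarity score below are assumptions chosen for clarity.

```python
# Hedged sketch of the client-side pipeline: pass-through filtering followed
# by frame skipping based on a similarity threshold. The similarity metric
# and all thresholds here are hypothetical; the paper does not define them.
from math import dist  # Euclidean distance (Python 3.8+)

def pass_through(points, axis=2, lo=0.0, hi=4.0):
    """Keep only points whose coordinate on `axis` lies in [lo, hi].
    PCL's PassThrough filter behaves analogously (e.g. clipping by depth)."""
    return [p for p in points if lo <= p[axis] <= hi]

def frame_similarity(prev, curr):
    """Hypothetical similarity score in (0, 1]: 1 / (1 + mean pairwise
    Euclidean distance) for two same-length point lists."""
    if len(prev) != len(curr) or not prev:
        return 0.0
    mean_d = sum(dist(p, q) for p, q in zip(prev, curr)) / len(prev)
    return 1.0 / (1.0 + mean_d)

def should_skip(prev, curr, threshold=0.9):
    """Skip transmitting `curr` when it is sufficiently similar to `prev`."""
    return frame_similarity(prev, curr) >= threshold

# Two nearly identical frames -> skip; a large scene change -> transmit.
frame_a = [(0.0, 0.0, 1.0), (1.0, 1.0, 2.0)]
frame_b = [(0.0, 0.0, 1.01), (1.0, 1.0, 2.01)]
frame_c = [(5.0, 5.0, 3.0), (6.0, 6.0, 3.5)]

print(should_skip(pass_through(frame_a), pass_through(frame_b)))  # True
print(should_skip(pass_through(frame_a), pass_through(frame_c)))  # False
```

In a real client, frames that are not skipped would then be serialized and sent to the server; the point of the threshold check is that near-duplicate frames never consume bandwidth at all.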

Received: 18 December 2016, Accepted: 19 January 2017, Published Online: 28 January 2017
