Kineto
-
The necessity of Kineto’s 360-degree camera
-
As mentioned above, it streams video from the hub devices in each classroom.
-
How this differs from other environments:
- It mixes live streaming (real-time) and on-demand viewing.
- How to switch between the two (a sketch follows below).
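A minimal sketch of that switching decision in Swift; the 30-second threshold and the mode names are placeholders, not anything decided in these notes:

```swift
import Foundation

// Which delivery path a viewer should use, depending on how far
// behind the live edge they are watching.
enum PlaybackMode {
    case live      // WebRTC: low latency, conversation is possible
    case onDemand  // HLS: seekable archive, higher latency is acceptable
}

// The 30-second threshold is an arbitrary placeholder.
func playbackMode(secondsBehindLive: TimeInterval) -> PlaybackMode {
    return secondsBehindLive < 30 ? .live : .onDemand
}
```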
-
Is it necessary for students to have a hub?
- If you want bidirectional communication between students, then yes.
-
- Lag is severe.
- It seems to be easy to keep archived videos.
-
- Ultra-low latency, enough that conversation is possible, but video quality drops (adaptive bitrate).
- Simultaneous streaming to hundreds of viewers or more seems hard, but that's unlikely in a classroom environment, right?
-
Well, in a normal class, that might be true, but I actually thought it might be interesting if there were hundreds of people connected and there was always someone to talk to, even if they were in different time zones.
- spatial.chat determines communication and discussion partners based on a two-dimensional position.
- kineto determines them based on temporal position, not physical location. Pretty interesting, huh? (A sketch of this matching idea follows below.)
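To make the temporal-matching idea concrete, here is a minimal sketch in Swift; the Viewer type, the field names, and the 60-second window are all assumptions for illustration, not anything kineto actually does:

```swift
import Foundation

// "Matching by temporal position": viewers whose playback positions in the
// same lecture are close enough get grouped as conversation partners.
struct Viewer {
    let name: String
    let playbackPosition: TimeInterval  // seconds from the start of the lecture
}

// The 60-second window is an arbitrary placeholder.
func conversationPartners(for me: Viewer,
                          among others: [Viewer],
                          window: TimeInterval = 60) -> [Viewer] {
    others.filter { abs($0.playbackPosition - me.playbackPosition) <= window }
}
```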
-
Actually, even in a school environment, it might be possible for a whole-school assembly level.
- At that level, there may not be a need for real-time communication between the broadcasting party and all the receiving parties.
-
- If it’s done locally in a school, can it be easily done peer-to-peer?
-
WebRTC to HLS https://medium.com/@voluntas/webrtc-to-hls-の可能性-fb1e847c9537
-
When we manipulate time, it seems we'll have to do something clever: plain WebRTC to WebRTC when real-time conversation is needed, and WebRTC to HLS when it is not.
- Send the video to the server using SkyWay Gateway, and then send it to the HLS server from there.
- It seems like it could be expensive and scary.
- AWS Kinesis Video Streams is also an option.
- https://qiita.com/yusuke84/items/73319b9a1dbc61b27c0f
- It’s mentioned here (in the article by the SkyWay person).
- It would be nice if the WebRTC iOS SDK were written in Swift (SkyWay’s is Objective-C).
-
It might be good to use Zoom for streaming.
- Advantages:
- Can use screen sharing, etc.
- Familiar operations.
- No need to develop on the sender side.
- Various options like HLS, RTMP, etc. https://devforum.zoom.us/t/how-do-i-get-an-output-from-zoom-as-an-rtmp-push-pull-hls-rtp-for-aws-elemental-media-connect/15586
- Disadvantages:
- Inconvenient configuration of the streaming destination URL.
- Video quality.
-
Changed to HLS.
- It seems the playlist type must be set to “event” to be able to view past video. If it is set to “live”, past segments cannot be viewed.
- This can be configured in nginx.conf (a sketch follows below).
- https://stackoverflow.com/questions/27518519/how-do-you-enable-video-seeking-in-safari-using-http-live-streaming
- Just opening the m3u8 file in Safari is not enough; the playlist type has to be set to “event” for seeking to work.
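A minimal sketch of the relevant part of nginx.conf, assuming the HLS output comes from the nginx-rtmp-module; the application name and paths are placeholders:

```nginx
rtmp {
    server {
        listen 1935;
        application live {
            live on;

            # Write HLS segments and the m3u8 playlist to disk.
            hls on;
            hls_path /tmp/hls;

            # "event" keeps every segment in the playlist, so viewers can seek
            # back into the past; the default sliding window drops old segments.
            hls_playlist_type event;
        }
    }
}
```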
-
https://note.com/chitapapa/n/n49f5b7b635c4
- A page with various summaries.
- It would be good to read the page by the author of WebRTC SFU Sora once I have gained more knowledge.
- There is a lot of material about WebRTC.
- https://voluntas.github.io/
- Getting Started with Real-Time Video Streaming https://gist.github.com/voluntas/076fee77f30a0ca7a9b9
- Getting Started with WebRTC https://gist.github.com/voluntas/67e5a26915751226fdcf
- ImageFlux Live Streaming.
- There are also various things about HLS.
- I should take a good look at the slides that are included.
-
How do we process the video when displaying it with AVPlayer?
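One possible approach, sketched under the assumption that we play the HLS stream with AVPlayer and pull decoded frames out through AVPlayerItemVideoOutput for our own processing; the URL is a placeholder:

```swift
import AVFoundation
import CoreVideo
import QuartzCore

// Placeholder URL for a classroom's HLS playlist.
let url = URL(string: "https://example.com/hls/classroom1.m3u8")!
let item = AVPlayerItem(url: url)

// Ask AVFoundation for decoded frames as BGRA pixel buffers so we can
// post-process them ourselves (e.g. re-project the 360-degree image).
let output = AVPlayerItemVideoOutput(pixelBufferAttributes: [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
])
item.add(output)

let player = AVPlayer(playerItem: item)
player.play()

// In a real app a CADisplayLink would call this once per screen refresh.
func copyCurrentFrame() -> CVPixelBuffer? {
    let itemTime = output.itemTime(forHostTime: CACurrentMediaTime())
    guard output.hasNewPixelBuffer(forItemTime: itemTime) else { return nil }
    return output.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil)
}
```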
-
Issue with AVPlayer not working.
- Examples:
- Akito, Mei