
Building a custom Camera

m99

Hi
If I understand correctly, the camera for the drone communicates over WiFi.
So I may be able to build a custom camera using a Raspberry Pi or similar.
But to get started, I need some help with how the communication works between the drone and the camera.
As I understand it today, the drone connects to the camera's WiFi and gets an IP address (can't remember the address, will look it up).
The password for the camera was 0123456789.
Then the drone presumably starts reading some stream from the camera, which is sent back to the ST16 controller.
Starting and stopping video may be controlled with some HTTP calls. The gimbal also has connectors for power and control; does anyone know what the connectors are and what kind of data they carry?
 
The topic was discussed in a few threads. Use "search" before asking, please.
 
I did see your reply deep in some thread, on page 7, but it was hard to follow the progress, and the topic was something else, like "bind anything to ST16?".
Can we try to collect info only about the camera here?
Then I can make a repo for it later and share code.
 
Please ask with quotations of the facts already posted. It would be very difficult for me to rewrite everything.
 
The CGO3+ camera from the Typhoon H (H480) is controlled by CGI commands to IP address 192.168.42.1.
More info about this is in the doc section here: GitHub - h-elsner/CGO3control: Tool extracted from Q500log2kml
The program itself can be used as a test tool.
The video downlink from the camera uses the RTSP protocol.
Media files: http://192.168.42.1/DCIM/100MEDIA/
Live stream: rtsp://192.168.42.1/live
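To make those endpoints concrete, here is a minimal Python sketch for building the CGI command URLs (a sketch only; the full command set is documented in the CGO3control repo, and GET_STATUS is one command that appears later in this thread):

```python
from urllib.parse import urlencode

CAMERA_IP = "192.168.42.1"

def cgi_url(cmd: str, **params: str) -> str:
    """Build a CGO3+ CGI command URL (note: commands are case-sensitive)."""
    return f"http://{CAMERA_IP}/cgi-bin/cgi?" + urlencode({"CMD": cmd, **params})

# Endpoints mentioned in this thread:
MEDIA_URL = f"http://{CAMERA_IP}/DCIM/100MEDIA/"   # recorded media files
LIVE_STREAM = f"rtsp://{CAMERA_IP}/live"           # live video downlink

print(cgi_url("GET_STATUS"))
# To actually send it you must be on the camera's WiFi, e.g.:
#   import urllib.request, json
#   status = json.loads(urllib.request.urlopen(cgi_url("GET_STATUS")).read())
```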

The gimbal gets data via UART and provides +5V for the camera. The camera is connected to the UART too, but I don't know what is going on there.
For the gimbal connectors, the best source is https://yuneecpilots.com/attachments/wtfdproject-rev89-pdf.27755/

br HE
 
Thanks, this is what I'm looking for.
I will make a repo and see if I can build a Raspberry Pi camera module.
Will share the GitHub repo here later.
 
Possibly. I tried it myself, but it will probably be of little use to people; no wow effect was produced.

No reason to share my sources until I see at least minimal individual progress. There was a guy here who wanted success but didn't listen to simple precautions and advice. The result was a CGI-bin script based on Node.js, which is useful at some steps of the project, but certainly isn't mandatory at first.

Read everything posted at least a few times. All the information necessary for a successful Pi-cam Yuneec project is here.
 
Thanks, this is what I'm looking for.
I will make a repo and see if I can build a Raspberry Pi camera module.
Will share the GitHub repo here later.

I will be keeping an eye out for this repo. I spent some time trying to get a Raspberry Pi to emulate a CGO3+ and/or MK58 video transmitter module. I was able to get it working for display on an ST10+ controller, but the lag was horrible (~2.5 s). I always felt that was because I don't really know Linux very well and was only bashing it together from various unrelated tutorials I found.

I think I got it working (with the 2.5 s of lag) on the ST16 also, but I don't remember for sure.

This is a link to a post where I was using node.js to respond to the initialization requests from the ST16/ST10 to the raspberry pi.

 
This might be stuff you all already know, but just in case, I'll cover it anyway.

Lag in network-transmitted video is almost always down to how big a buffer you use. For example, if you connect to one of the Yuneec cameras using VLC, by default it will be somewhere around 2-5 seconds behind due to VLC's default buffer size. The exact time varies because it's a memory-size buffer, not a time-based one: higher-quality video will have less lag because the video data takes more space per second. The official Yuneec apps (the flight one and the various CGO apps) use a very small buffer to make them appear more instant.

The same goes for the server (camera) side. If you're trying to fake a camera using existing software, remember that some (or most) video streaming software will assume you're going to transmit over the internet, and that quality and smoothness are more important than latency. That means it will use a larger buffer to smooth out any congestion that would cause spikes in latency. Reducing this buffering and disabling any features intended to improve quality on wobbly networks will reduce the latency of the video shown on the client (whether that's VLC or an official Yuneec app).
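The buffer-size vs. bitrate relationship described above can be sketched with a bit of arithmetic (the 1 MiB buffer and the bitrates below are made-up illustration values, not measured from any Yuneec app):

```python
def buffer_lag_seconds(buffer_bytes: int, bitrate_bps: int) -> float:
    """Approximate latency added by a size-based buffer:
    the seconds of video needed to fill it at the given bitrate."""
    return buffer_bytes / (bitrate_bps / 8)

BUF = 1 * 1024 * 1024                          # hypothetical 1 MiB buffer
low_q = buffer_lag_seconds(BUF, 2_000_000)     # ~2 Mbit/s stream -> ~4.2 s
high_q = buffer_lag_seconds(BUF, 8_000_000)    # ~8 Mbit/s stream -> ~1.0 s
```

Same buffer, four times the bitrate, a quarter of the lag, which is why a higher-quality stream appears less delayed behind the same client buffer.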


Edit: I'm also interested in this. Not really for anything flight related. Just from the technical "how it works" perspective.
 
Hi
Yes, I have been thinking about some of this. I do some FPV flights with my home-built quads, and there you need a fast image rate.

For this, I now have an RTSP server running, and then have 1-X cams connect to the RTSP server. I then plan to rotate cams onto the /live endpoint (so you can switch between cams).
Taking photos and videos is done by RTSP clients that read the streams from the RTSP server, so you would take photos from all connected cams.

To get a good frame rate to the controller, I can work on the RTSP server, rerouting the streams from the cams and changing them to see if I can get a good image speed.
So if the cams are connected at rtsp://1/video0 with high quality, I can read the stream, lower the quality, and publish the new stream to the /live endpoint that the controller reads.
But we'll see if it works :-)
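A minimal sketch of that downscale-and-republish step, assuming ffmpeg is installed on the board and the RTSP server accepts a publish on /live (the source URL below is a placeholder, not an address from this thread):

```python
def reencode_cmd(src: str, dst: str, height: int = 720) -> list[str]:
    """Build an ffmpeg argv that pulls a high-quality cam stream,
    downscales it, and republishes it to the RTSP server's /live path."""
    return [
        "ffmpeg",
        "-i", src,
        "-vf", f"scale=-2:{height}",   # keep aspect ratio, cap the height
        "-c:v", "libx264",
        "-preset", "ultrafast",        # cheapest x264 preset for a small board
        "-tune", "zerolatency",        # minimize encoder-side buffering
        "-f", "rtsp", dst,
    ]

cmd = reencode_cmd("rtsp://camera-host/video0",      # placeholder source cam
                   "rtsp://192.168.42.1:554/live")   # endpoint the ST16 reads
# import subprocess; subprocess.run(cmd)  # run on the board with ffmpeg installed
```

The zerolatency tune and small preset follow the buffering advice earlier in the thread: they trade quality for the low-latency path the controller needs.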


But the goal for me is not flying via the controller.
I want to see if I can add more cams and do automated flights for mapping work with the drone. In that case, any delay in the images from the drone is not an issue.

I have added the CGI calls, and the video works over RTSP. I have also fixed the network setup, so I will try to connect it to the controller later this week. I have some work with the drone that needs to be done first. I have built it around Docker, so you can run a container to start up the tools.

There are no Raspberry Pis left, so I used a Rock 4 (Rock4 - Radxa Wiki), and the base system is Armbian.
 
Hi
I have the base running, but the ST16 is not getting the RTSP stream. I have searched here and tried to find the right ffmpeg settings.
But I guess maybe someone here has the commands used to send the stream to the controller.

The controller also sends a command get_bind_state, but I guess I can ignore that?

192.168.42.52 - - [01/Feb/2023 21:18:45] "GET /cgi-bin/cgi?CMD=GET_STATUS HTTP/1.1" 200 -
yuneec-camera-api-1 | 192.168.42.52 - - [01/Feb/2023 21:18:48] "GET /cgi-bin/cgi?CMD=get_bind_state HTTP/1.1" 200 -


Here is the log from when the ST16 starts trying to read the stream:

yuneec-camera-rtsp-1 | 2023/02/01 21:18:45 DEB [RTSP] [conn 192.168.42.52:46452] [c->s] PLAY rtsp://192.168.42.1:554/live RTSP/1.0
yuneec-camera-rtsp-1 | CSeq: 5
yuneec-camera-rtsp-1 | Range: npt=0.000-
yuneec-camera-rtsp-1 | Session: 7da23749-b2bb-4cd2-9a89-7a05d31c69d0
yuneec-camera-rtsp-1 | User-Agent: Yuneec RTSP protocol
yuneec-camera-rtsp-1 |
yuneec-camera-rtsp-1 |
yuneec-camera-rtsp-1 | 2023/02/01 21:18:45 INF [RTSP] [session cf6fe69b] is reading from path 'live', with UDP, 1 track (H264)
yuneec-camera-rtsp-1 | 2023/02/01 21:18:45 DEB [RTSP] [conn 192.168.42.52:46452] [s->c] RTSP/1.0 200 OK
yuneec-camera-rtsp-1 | CSeq: 5
yuneec-camera-rtsp-1 | RTP-Info: url=rtsp://192.168.42.1:554/live/mediaUUID=e62ce6ee-b846-4a95-8c17-4af841918287;seq=14908;rtptime=1300561356
yuneec-camera-rtsp-1 | Server: gortsplib
yuneec-camera-rtsp-1 | Session: 7da23749-b2bb-4cd2-9a89-7a05d31c69d0;timeout=60
yuneec-camera-rtsp-1 |
yuneec-camera-rtsp-1 |
yuneec-camera-rtsp-1 | 2023/02/01 21:18:45 DEB [RTSP] [conn 192.168.42.52:46452] [c->s] OPTIONS rtsp://192.168.42.1:554/live RTSP/1.0
yuneec-camera-rtsp-1 | CSeq: 6
yuneec-camera-rtsp-1 | User-Agent: Yuneec RTSP protocol
yuneec-camera-rtsp-1 |
yuneec-camera-rtsp-1 |
yuneec-camera-rtsp-1 | 2023/02/01 21:18:45 DEB [RTSP] [conn 192.168.42.52:46452] [s->c] RTSP/1.0 200 OK
yuneec-camera-rtsp-1 | CSeq: 6
yuneec-camera-rtsp-1 | Public: DESCRIBE, ANNOUNCE, SETUP, PLAY, RECORD, PAUSE, GET_PARAMETER, TEARDOWN
yuneec-camera-rtsp-1 | Server: gortsplib

So, please help: what codec and settings did you use to stream to the ST16 with ffmpeg?
 
If I understand right, the ST16 rejects the connection because it thinks it is not bound. That's why it sends the bind command after GET_STATUS.
What answer did you send after GET_STATUS?

You can find an example for the CGO3 here:
But you can test it for the CGO3+ using the tool I mentioned above.

[screenshot attachment: 1675322449630.png]

br HE
 
Hi, thanks for the fast reply!

I followed the spec from that repo, and I can see on the ST16 that it reads the values and sets, for example, the SD card info from the GET_STATUS values.
So GET_STATUS is working.
Does the return here need to match the video stream? Here I return "video_mode":"3840x2160F30", but the stream is only in HD format?

Looks like your response to GET_STATUS is different from mine below?



Code:
def GET_STATUS(data):
    MODE = data.get('MODE')
    print(MODE)
    returnData = {
        "rval": 0,
        "msg_id": 1,
        "cam_mode": "1",
        "status": "vf",
        "sdfree": "15191040",
        "sdtotal": "15549440",
        "record_time": "0",
        "white_balance": "0",
        "ae_enable": "1",
        "iq_type": "1",
        "exposure_value": "0.0",
        "video_mode": "3840x2160F30",
        "awb_lock": "0",
        "audio_sw": "1",
        "shutter_time": "60",
        "iso_value": "ISO_1600",
        "photo_format": "dng"}
    return returnData


// Matte
 
Ah, so there can be different GET_STATUS messages depending on the camera?
 
Here are the results from my CGO3+ for INDEX_PAGE and GET_STATUS. INDEX_PAGE is for initialization of the camera, as I understand it.

JSON:
INDEX_PAGE: {
  "rval": 0,
  "msg_id": "257",
  "param": "1",
  "fw_ver": "3.2.34(E)",
  "cam_mode": "1",
  "status": "vf",
  "iq_type": "1",
  "white_balance": "0",
  "sdfree": "26250240",
  "sdtotal": "31154176",
  "exposure_value": "0.0",
  "video_mode": "4096x2160F25",
  "speed_rate": "54M auto",
  "record_time": "0",
  "awb_lock": "0",
  "ae_enable": "1",
  "audio_sw": "1",
  "shutter_time": "30",
  "iso_value": "ISO_600",
  "photo_format": "jpg",
  "audio_enable": "1",
  "rtsp_res": "720P",
  "photo_mode": "1",
  "photo_num": "1",
  "photo_times": "1",
  "ev_step": "0.000000",
  "interval_ms": "333",
  "soft_ver": "30234",
  "cam_scene": "0",
  "left_time": "0",
  "metering_mode": "2",
  "x_ratio": "0.00",
  "y_ratio": "0.00",
  "layers": "1",
  "pitch": "0",
  "yaw": "60",
  "timer_photo_sta": "0"
}


GET_STATUS: {
  "rval": 0,
  "msg_id": 1,
  "cam_mode": "1",
  "status": "vf",
  "sdfree": "26250240",
  "sdtotal": "31154176",
  "record_time": "0",
  "white_balance": "0",
  "ae_enable": "1",
  "iq_type": "1",
  "exposure_value": "0.0",
  "video_mode": "4096x2160F25",
  "awb_lock": "0",
  "audio_sw": "1",
  "shutter_time": "30",
  "iso_value": "ISO_600",
  "photo_format": "jpg",
  "rtsp_res": "720P",
  "photo_mode": "1",
  "photo_num": "1",
  "photo_times": "1",
  "ev_step": "0.000000",
  "interval_ms": "333",
  "cam_scene": "0",
  "audio_enable": "1",
  "left_time": "0",
  "metering_mode": "2",
  "x_ratio": "0.00",
  "y_ratio": "0.00",
  "layers": "1",
  "pitch": "0",
  "yaw": "60",
  "timer_photo_sta": "0"
}

video_mode is what the camera is recording, not the video downlink via RTSP.
Note: upper/lower case in commands and responses matters!
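Since field coverage may matter as much as case, a quick sanity check is to diff the key sets of the real CGO3+ reply above against the def GET_STATUS posted earlier:

```python
# Key set of the real CGO3+ GET_STATUS reply (posted above)
real_keys = {
    "rval", "msg_id", "cam_mode", "status", "sdfree", "sdtotal",
    "record_time", "white_balance", "ae_enable", "iq_type",
    "exposure_value", "video_mode", "awb_lock", "audio_sw",
    "shutter_time", "iso_value", "photo_format", "rtsp_res",
    "photo_mode", "photo_num", "photo_times", "ev_step", "interval_ms",
    "cam_scene", "audio_enable", "left_time", "metering_mode",
    "x_ratio", "y_ratio", "layers", "pitch", "yaw", "timer_photo_sta",
}
# Key set the emulator currently returns (from the def GET_STATUS earlier)
emulated_keys = {
    "rval", "msg_id", "cam_mode", "status", "sdfree", "sdtotal",
    "record_time", "white_balance", "ae_enable", "iq_type",
    "exposure_value", "video_mode", "awb_lock", "audio_sw",
    "shutter_time", "iso_value", "photo_format",
}
missing = real_keys - emulated_keys
print(sorted(missing))  # fields the ST16 may expect but never receives
```

Notably, rtsp_res is among the missing fields, which could be relevant to the stream question; whether the ST16 actually requires any of them is untested here.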

br HE
 
Thanks, I got further now and saw this POST:

"POST /cgi-bin/cgi?CMD=request_bind&client_mac_address=e0:b6:f5:80:2e:99 HTTP/1.1" 405

Does the ST16 always get an IP ending in .2? I have a DHCP server now handing out IPs.
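That 405 suggests the emulator's CGI endpoint only accepts GET, while the ST16 sends request_bind as POST. A minimal sketch using Python's built-in http.server that answers both methods with the same handler (the response body here is a guess; the real CGO3+ answer to request_bind is not documented in this thread):

```python
import json
from http.server import BaseHTTPRequestHandler
from urllib.parse import urlparse, parse_qs

def parse_cmd(path: str) -> str:
    """Extract the CMD parameter from a /cgi-bin/cgi request path."""
    return parse_qs(urlparse(path).query).get("CMD", [""])[0]

class CgiHandler(BaseHTTPRequestHandler):
    def _reply(self):
        cmd = parse_cmd(self.path)
        # Hypothetical body; replace with whatever the real camera returns
        body = json.dumps({"rval": 0, "cmd": cmd}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    do_GET = _reply
    do_POST = _reply  # the ST16 sends request_bind as POST: don't 405 it

# from http.server import HTTPServer
# HTTPServer(("192.168.42.1", 80), CgiHandler).serve_forever()
```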
 
Something with GET_STATUS; I'll dig more into that, I guess.


Code:
yuneec-camera-rtsp-1     | 2023/02/03 13:09:02 INF [RTSP] [session 9f786fb2] created by 192.168.42.52:35655
yuneec-camera-rtsp-1     | 2023/02/03 13:09:02 INF [RTSP] [session 9f786fb2] is reading from path 'live', with UDP, 1 track (H264)
yuneec-camera-api-1      | 192.168.42.52 - - [03/Feb/2023 13:09:05] "GET /cgi-bin/cgi?CMD=GET_STATUS HTTP/1.1" 200 -
yuneec-camera-api-1      | 192.168.42.52 - - [03/Feb/2023 13:09:06] "GET /cgi-bin/cgi?CMD=GET_STATUS HTTP/1.1" 200 -
yuneec-camera-rtsp-1     | 2023/02/03 13:09:08 INF [RTSP] [session 9f786fb2] destroyed (teared down by 192.168.42.52:35655)
yuneec-camera-rtsp-1     | 2023/02/03 13:09:08 INF [RTSP] [conn 192.168.42.52:35655] closed (EOF)
yuneec-camera-rtsp-1     | 2023/02/03 13:09:09 INF [RTSP] [conn 192.168.42.52:39845] opened
yuneec-camera-rtsp-1     | 2023/02/03 13:09:09 INF [RTSP] [session d8d90a74] created by 192.168.42.52:39845
yuneec-camera-rtsp-1     | 2023/02/03 13:09:09 INF [RTSP] [session d8d90a74] is reading from path 'live', with UDP, 1 track (H264)
yuneec-camera-rtsp-1     | 2023/02/03 13:09:15 INF [RTSP] [session d8d90a74] destroyed (teared down by 192.168.42.52:39845)
yuneec-camera-rtsp-1     | 2023/02/03 13:09:15 INF [RTSP] [conn 192.168.42.52:39845] closed (EOF)
yuneec-camera-rtsp-1     | 2023/02/03 13:09:15 INF [RTSP] [conn 192.168.42.52:59325] opened
yuneec-camera-api-1      | 192.168.42.52 - - [03/Feb/2023 13:09:15] "GET /cgi-bin/cgi?CMD=GET_STATUS HTTP/1.1" 200 -
yuneec-camera-rtsp-1     | 2023/02/03 13:09:15 INF [RTSP] [session 9c2e6e87] created by 192.168.42.52:59325
yuneec-camera-rtsp-1     | 2023/02/03 13:09:15 INF [RTSP] [session 9c2e6e87] is reading from path 'live', with UDP, 1 track (H264)
yuneec-camera-api-1      | 192.168.42.52 - - [03/Feb/2023 13:09:16] "GET /cgi-bin/cgi?CMD=GET_STATUS HTTP/1.1" 200 -
 
Hi
If I understand correctly, the camera for the drone communicates over WiFi.
So I may be able to build a custom camera using a Raspberry Pi or similar.
But to get started, I need some help with how the communication works between the drone and the camera.
As I understand it today, the drone connects to the camera's WiFi and gets an IP address (can't remember the address, will look it up).
The password for the camera was 0123456789.
Then the drone presumably starts reading some stream from the camera, which is sent back to the ST16 controller.
Starting and stopping video may be controlled with some HTTP calls. The gimbal also has connectors for power and control; does anyone know what the connectors are and what kind of data they carry?
Correction: the camera password is 1234567890.
 
The ST16 gets its IP address from the camera, the same as my PC does if I connect to the camera's SSID. The address I get is always 192.168.42.2. Binding from my PC is done the normal way, as for any other WiFi network (SSID + password).
The strange thing I do not understand is: when I send the command "get_bind_state", the answer from the camera is {"isbinded":"no", "binded_client_address":"ff:ff:ff:ff:ff:ff"} and the connection to the camera CGI is lost. I need to send "GET_INDEX" again to initialize the camera for further communication. Could it be that the ST16 is checking for the proper MAC address, or against an SSID that contains part of the MAC? I will have a look into the ST16 source code, but I'm not so good with that Java stuff. May take a bit longer...

br HE
 
