
Omega, Phi, Kappa ?

Is there any way to get the values of Omega, Phi, Kappa from the H520 cameras?
(These are the X, Y, Z angles of the camera; in short, they indicate where the camera is looking.)
It would be useful for photogrammetry to know these angles, relative to the drone of course.
 

Defining relative or absolute accuracy, with GPS coordinates and camera placement reports, as well as alterations of XYZ, can easily be done. Data report: all mapping app software can generate a quality report with the number of tiles, densified points, averages, and coordinates.

Software that will allow you to redefine or edit the XYZ parameters includes ReCap Pro, PhotoScan, and GeoSetter. However, the Omega, Phi, Kappa angles are not saved in the XML; you need GeoSetter to help you compute those parameters in XML, use them together with the XYZ values from the camera as an XML file, and "save as". WHEW!!!

At times I map manually, when I am bored. Mapping manually means flying without the aid of a mapping app, because that is cheating. The Litchi app also came out with a list of cameras as a reference that you can load into an offsite program such as ReCap Pro or PhotoScan (I prefer ReCap Pro; it is easier to use than PhotoScan when it comes to editing or assigning Omega, Phi, Kappa values).

If you are confused by my explanations... bottom line is, what you are asking has been done; just follow the crumbs left by the guys who are doing agriculture mapping.
 
Thank you for the explanations, @RPR. In short, there's no way to get them (Omega, Phi, Kappa), but they can be derived from XYZ and GPS coordinates, of course using reference points and triangulation computed by software. Easy; that's the method I use in Pix4D, but it can be tedious. That's why some help getting these values from Yuneec would be nice and would speed up the process.
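For reference, here is a minimal sketch of the extraction step mentioned above: assuming the common photogrammetric factorization R = Rx(omega) * Ry(phi) * Rz(kappa) for the rotation matrix (the exact axis and sign conventions differ between Pix4D and other packages, so check yours), Omega, Phi and Kappa can be read back from a 3x3 rotation matrix like this. The class and method names are mine for illustration, not from any Yuneec or Pix4D SDK.

/**
 * Sketch only: recover Omega, Phi, Kappa (in degrees) from a 3x3 rotation
 * matrix, assuming the factorization R = Rx(omega) * Ry(phi) * Rz(kappa).
 * Conventions vary between software packages; gimbal lock at phi = +/-90
 * degrees is not handled.
 */
public final class OpkFromRotation {

    /** Returns {omega, phi, kappa} in degrees for a rotation matrix r[row][col]. */
    public static double[] extract(double[][] r) {
        // With R = Rx(w)*Ry(p)*Rz(k):
        //   r[0][2] =  sin(p)
        //   r[1][2] = -sin(w)*cos(p),  r[2][2] = cos(w)*cos(p)
        //   r[0][1] = -cos(p)*sin(k),  r[0][0] = cos(p)*cos(k)
        double phi   = Math.asin(r[0][2]);
        double omega = Math.atan2(-r[1][2], r[2][2]);
        double kappa = Math.atan2(-r[0][1], r[0][0]);
        return new double[] {
            Math.toDegrees(omega), Math.toDegrees(phi), Math.toDegrees(kappa)
        };
    }

    public static void main(String[] args) {
        // Identity matrix: camera perfectly level and aligned, so all angles are 0.
        double[][] identity = { {1, 0, 0}, {0, 1, 0}, {0, 0, 1} };
        double[] opk = extract(identity);
        System.out.printf("omega=%.1f phi=%.1f kappa=%.1f%n", opk[0], opk[1], opk[2]);
    }
}

The hard part, of course, is obtaining that rotation matrix in the first place, which is exactly what the missing Yuneec values would provide.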
 
Manual mapping is easy enough and is my preferred method when flying close to subjects or hazards.
 
Obviously, when Yuneec implements a POI monitoring function it has to have access to that information; it is absolutely necessary. It is another thing that they do not give us access to these variables, because they have decided they do not need to be public. I imagine they will be available to partners. It is also a matter of investigating a little more and checking the advanced mode of the ST16S and the list of available variables; that way you can find out which variables they are.
 
The manual mapping that I am referring to is flying the sUAS in a pattern where it takes not photos but a video, then later translating that into photos and giving them corresponding coordinates via telemetry. I find that taking photos is cheating and time-consuming.


Not a lot of folks understand this, but this technique goes way back to when agriculture mappers paved the way for aerial mapping. I have been talking to another operator in the commercial pilot forum who still uses the old method, and he has been successful in creating an accurate Python script.
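Purely as an illustration of that frame-tagging step, here is a rough sketch that assigns each extracted video frame the telemetry sample closest to it in time. Everything here is hypothetical: the field names and units are assumptions, not the actual H520/ST16 log format, and the frames themselves would be pulled out beforehand with a tool such as ffmpeg.

import java.util.List;

/**
 * Hypothetical sketch: assign a position to each video frame by picking the
 * telemetry sample whose timestamp is closest to the frame's capture time.
 */
public final class FrameGeotagger {

    /** One telemetry sample: time in ms since video start, position in degrees and meters. */
    public record Sample(long timeMs, double lat, double lon, double altM) {}

    /** Returns the telemetry sample closest in time to the given frame timestamp. */
    public static Sample nearest(List<Sample> log, long frameTimeMs) {
        Sample best = log.get(0);
        for (Sample s : log) {
            if (Math.abs(s.timeMs() - frameTimeMs) < Math.abs(best.timeMs() - frameTimeMs)) {
                best = s;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        List<Sample> log = List.of(
                new Sample(0, 47.3500, 8.5500, 80.0),
                new Sample(200, 47.3501, 8.5502, 80.1),
                new Sample(400, 47.3502, 8.5504, 80.2));
        // A frame extracted at t = 0.33 s gets the sample recorded at t = 0.4 s.
        System.out.println(nearest(log, 330));
    }
}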

 

You might find answers in DroneMapper; there is an extensive process that was discussed there which dates back to ArduPilot.
 

I don't have a Yuneec "commercial drone" H520, but man!!! These simple parameter values should be extractable, because my Inspire 1 and Inspire 2 do not disclose the OPK image geolocation values.
 

Looking through the SDK it should be possible. When a command is sent to the gimbal, the result is sent back via the class Gimbal.Result.

public static void asyncSetPitchAndYawOfJni (float pitch, float yaw, Gimbal.ResultListener listener)

Sets the camera pitch and yaw.
Parameters:
pitch: pitch angle of the gimbal, from 0 (level) to -90 (downward facing), in degrees
yaw: yaw angle of the gimbal from 0 (forward looking) in the body frame, in degrees
listener: the listener for the result callback
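For what it's worth, here is a rough sketch of how that call might be wired up from an app built on the SDK. The import/package lines are omitted because they depend on the SDK distribution, and the Gimbal.ResultListener callback is written as a lambda on the assumption that it is a single-method interface; check the actual SDK javadoc before relying on any of this.

// Sketch: somewhere inside a controller class that already has access to the
// SDK classes (imports omitted; they depend on the SDK package layout).
void pointCameraDown() {
    // -90 deg pitch = straight down, 0 deg yaw = forward, per the doc above.
    // The lambda form assumes Gimbal.ResultListener is a single-method
    // interface; that is an assumption, not confirmed SDK API.
    Gimbal.asyncSetPitchAndYawOfJni(-90.0f, 0.0f, result -> {
        // The result only indicates whether the command was accepted. Reading
        // the current attitude back (to derive Omega/Phi/Kappa) would still
        // need a separate attitude/telemetry query not shown in this post.
        System.out.println("Gimbal command result: " + result);
    });
}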
 

And roll?

See, for "other models of drone", what data is produced in Pix4D, with Omega, Phi and Kappa available, although Omega is usually close to zero.

In the attached example, Omega and Phi are close to zero, but Kappa has turned from -90 to 90, so it must be a pass-by flight looking at the horizon.

So the great trick would be to add these values to the EXIF of the photos... if available.
 

Attachments: .jpeg (340.3 KB)
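If the angles cannot be written into the EXIF, one workaround (a sketch under assumptions, not a documented Yuneec or Pix4D workflow) is a side-car text file listing image name, position, and Omega/Phi/Kappa, which photogrammetry packages such as Pix4D can import as external image geolocation and orientation. The column order below is an assumption; check it against your software's import format.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.Locale;

/**
 * Sketch: write a side-car "image geolocation and orientation" file with
 * Omega/Phi/Kappa per photo. The column order (name, lat, lon, alt,
 * omega, phi, kappa) is assumed, not taken from any vendor documentation.
 */
public final class OrientationFileWriter {

    public record ImagePose(String name, double lat, double lon, double alt,
                            double omega, double phi, double kappa) {}

    public static void write(Path out, List<ImagePose> poses) throws IOException {
        StringBuilder sb = new StringBuilder();
        for (ImagePose p : poses) {
            sb.append(String.format(Locale.ROOT, "%s,%.7f,%.7f,%.2f,%.3f,%.3f,%.3f%n",
                    p.name(), p.lat(), p.lon(), p.alt(), p.omega(), p.phi(), p.kappa()));
        }
        Files.writeString(out, sb.toString());
    }

    public static void main(String[] args) throws IOException {
        // Example row only; the angle values are placeholder data.
        write(Path.of("geotags.csv"), List.of(
                new ImagePose("IMG_0001.JPG", 47.3500000, 8.5500000, 80.00, 0.0, 0.0, -90.0)));
    }
}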
You're wrong. Yes, yes, it does send the data just fine, but to China and, what's more, without your knowledge... (view attachment 16900)

Your knowledge in mapping is now obvious. [emoji23]

OPK values are the backbone of mapping and every platform has them, even our phones. If Yuneec does not allow the user to read this data, is Yuneec insecure about its cameras' mediocre values? [emoji23]
 
I know... that's why I've got the best brand.

@RPR Not talking about Inspire 1 or 2... another wild guess?

You actually do not have the best brand, not until Yuneec grows some big balls on quality.
Yuneec can be the next best thing, but we are not their priority.
 
So the great trick would be to add these values to the EXIF of the photos... if available.

Ohhh hey :) you got it
 
You actually do not have the best brand, not until Yuneec grows some big balls on quality.

Please don't drive me into a comparison of drones... that's not my question. Indeed, Yuneec has probably been asked this question before (and ArduPilot before that). So no Omega, Phi, Kappa for now...
 

I did not look at your posted photo correctly.

The OPK is there... or else mapping would not be possible with DD and Pix4D, especially with that kind of matching accuracy. And I do not want to get an H520 to find out.

Have you tried asking in the GitHub PX4 forum?
 
That's an example from another brand....
 

Well, I looked for it and I didn't find it. Well done, mate :D

And roll?

In photogrammetry I understand that you do not need the roll, because the gimbal is responsible for keeping it horizontal automatically at all times. Even the yaw should be at 0 degrees at all times if the gimbal corrects properly, so shouldn't the pitch be the only variable to take into account? Or does the behavior of the aircraft influence it enough to be taken into account?
 
