
metadata broker feedback #215

Open
paromarc opened this issue Mar 15, 2024 · 22 comments
Labels
BETA API Questions and feedback on BETA APIs

Comments

@paromarc

Hi,

I have been playing around with the metadata broker API and wanted to report some feedback.

Starting with the API docs: https://axiscommunications.github.io/acap-documentation/docs/api/src/api/metadata-broker/html/standard_topics.html

The payload doesn't start with the "data" property shown in the docs:

```json
{
  "data": {
    "frame": {
      "timestamp": "2023-03-08T09:00:19.320111Z",
      "observations": [
```

but instead starts at the "frame" property:

```json
"frame": {
  "timestamp": "2023-03-08T09:00:19.320111Z",
  "observations": [
```
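A minimal sketch of parsing a "frame"-rooted payload like the one above (Python; the `bounding_box` and `class` key names are assumptions based on the fields that appear in this thread, so adjust them to what your firmware actually emits):

```python
import json

def parse_observations(payload: str):
    """Parse an analytics_scene_description frame.

    Note the top-level key is "frame", not "data"."""
    frame = json.loads(payload)["frame"]
    results = []
    for obs in frame.get("observations", []):
        box = obs["bounding_box"]
        results.append({
            "type": obs["class"]["type"],
            "bottom": box["bottom"],
            "left": box["left"],
            "right": box["right"],
            "top": box["top"],
        })
    return frame["timestamp"], results

sample = ('{"frame": {"timestamp": "2023-03-08T09:00:19.320111Z",'
          ' "observations": [{"bounding_box": {"bottom": 0.1273,'
          ' "left": 0.2086, "right": 0.2247, "top": 0.0986},'
          ' "class": {"type": "Face", "score": 0.9}}]}}')
ts, obs = parse_observations(sample)
```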

I see my P3265 (fw 11.9.60) is detecting Human and Car, but also Face.

In my app I am parsing the JSON payload and see, for example: ** Message: 12:59:31.623: type Face: bottom 0.127300, left 0.208600, right 0.224700, top 0.098600

I have also been playing around with the 'BestShot' feature and uploading the base64 image to an S3 bucket for testing.
My P3265 is facing a display looping the same image, but the best shot seems to be inconsistent. Sometimes I get the back of my body, sometimes the front, sometimes just my face (even though my app is specifying only 'human' type and not 'face').
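For reference, a minimal sketch of the decode-and-save step for the base64 'BestShot' image (Python; the function name and path are illustrative, and the actual S3 upload is omitted):

```python
import base64
from pathlib import Path

def save_best_shot(b64_image: str, path: str) -> bytes:
    """Decode the base64 'BestShot' image; in the real app the bytes
    are then uploaded to S3, here they are just written to disk."""
    jpeg_bytes = base64.b64decode(b64_image)
    Path(path).write_bytes(jpeg_bytes)
    return jpeg_bytes
```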

I am attaching a quick video showing the behavior after watching the same video twice and getting different results.

Thanks!
Marco

MetadataBroker_BestShot.mp4
@pataxis
Contributor

pataxis commented Mar 21, 2024

Hi @paromarc, thanks for your feedback! I will send this to the team working on this API.

@pataxis pataxis added the BETA API Questions and feedback on BETA APIs label Apr 9, 2024
@kevin-barudde-work

Hi, I will summarize the meeting notes here for others who have the same questions:

  1. The documentation was wrong and is now fixed (the "data" field was removed).
  2. You can get other classes than the ones shown in the documentation example.
  3. The best-snapshot feature is not deterministic for tracks.

@Cacsjep

Cacsjep commented Jan 24, 2025

Hey,

My camera has two producers according to the VAPIX list-producers call:
VideoMotionTracker and AnalyticsSceneDescription (ASD).

Since there are only topics for ASD, I can only subscribe to that one, so the questions below concern ASD.

Questions:
1. Does this model run 24/7, or only while I am subscribed?
2. The accuracy of the scene model feels unreliable, as it often loses track when I remain seated in the scene for an extended period. Are there any improvements planned?

Thanks

@kevin-barudde-work

Hi @Cacsjep!

  1. The model only runs when someone wants to use it, for example when you subscribe.
  2. Currently, stationary objects in the scene are filtered out.

@Cacsjep

Cacsjep commented Jan 27, 2025

@kevin-barudde-work thank you for your fast response.

  1. Perfect, that makes sense.
  2. In that case, should "Consolidated" be used instead? Also, I've never come across this in the documentation. Did I miss it, or is it just undocumented? This would imply that when I start a subscription and a car or person is already stationary in the scene, it is not detected, right? Is there a plan to make this opt-out?

Thank you

@sguaqueta-work

sguaqueta-work commented Jan 27, 2025

Hi @Cacsjep!

The information about the data payload only including moving objects is documented in the integration guidelines, but it should be more explicitly stated in other places as well. Thank you for the feedback.

Image

The payload of consolidated metadata only includes moving items as well. Having said that, there are plans to include stationary objects in the output. May I ask you to share a bit more about the use case you are working with? Is there anything else apart from stationary objects that is missing?

@Cacsjep

Cacsjep commented Jan 27, 2025

Hey @sguaqueta-work,

Thanks for clarifying that this currently only tracks moving objects. It wasn’t entirely clear to me initially, so updating the documentation would definitely be helpful.

It’s exciting to hear about plans to include stationary objects as well! Right now I’m just evaluating MDB and was curious why it loses tracks. Adding stationary object tracking would open up a lot of possibilities, like counting specific object classes in a scene, among other things.

I believe it would be great to give developers the flexibility to choose between tracking only moving objects or including stationary ones as well.

Do you have any information on when the MDB will move out of beta?

Thanks a lot!

@Cacsjep

Cacsjep commented Jan 27, 2025

Is there a plan for a radar topic to subscribe to in the future?

@rywager

rywager commented Feb 4, 2025

Is there a plan to allow us to use MDB to detect only specific items, like a face, and then be able to use a lower-res larod model and save resources, hopefully increasing its efficacy?

Because as it stands, the latest MDB is very spotty with coverage: sometimes it reports a moving face and then ignores others; you restart and get a burst of them, then it goes away, etc.

I always notice that whenever "top" load starts to grow to, say, 2+, or when vmem goes up, you don't see anything from the MDB, which makes it very unreliable.

@sguaqueta-work

Hello @rywager,
Thank you for reporting these issues. We would like to obtain some additional information from you to better understand what might be causing the behaviors described. Could you please create a support case and include a complete server report from your device?

We are interested in obtaining some details from the device such as the camera model, AXIS OS version, and an image of the scene to understand the environment and placement of the camera.

@Cacsjep

Cacsjep commented Feb 10, 2025

Hey Axis Team,

After visualizing the observations using axoverlay, I noticed a significant delay. Initially, I thought I had made a mistake, as the metadata visualization in the Web UI seemed far more accurate. To investigate further, I obtained the same data via WebSocket and rendered it using the Axis Media Stream library, but the results were not significantly better. Across all tests, I received approximately 10 message frames per second from the MDB, whether through WebSockets or the C API.

It seemed strange that the Axis Web UI appeared more precise. To compare, I opened two windows—one showing the stream via the "installation" section in the Web UI and the other displaying the metadata visualization. I observed that the metadata visualization has higher latency. It seems as though an intentional delay (~260ms) is added to reduce the gap between MDB observations and overlay rendering. Additionally, the MDB observation timestamps exhibit an inherent latency of ~600ms (current time vs. observation timestamp).

Could you confirm the following:

  1. Is the estimated latency of ~260ms for the metadata visualization intentional, and is it implemented to align observations with the overlay? If so, documenting this would be very helpful for those implementing custom overlays.
  2. Does the MDB operate at a rate of approximately 10 message frames per second? If so, adding this to the documentation would also be useful.
  3. What is the approximate latency of an MDB observation (from when it’s generated to when it’s sent)?
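A rough sketch of how I estimate the ~600 ms observation latency above (Python; assumes the observation timestamp is RFC 3339 UTC with a 'Z' suffix, as in the examples in this thread):

```python
from datetime import datetime, timezone

def observation_latency_ms(obs_timestamp: str, now=None) -> float:
    """Age of an MDB observation: wall-clock time minus the frame
    timestamp carried in the payload (RFC 3339, 'Z' suffix)."""
    ts = datetime.fromisoformat(obs_timestamp.replace("Z", "+00:00"))
    now = now or datetime.now(timezone.utc)
    return (now - ts).total_seconds() * 1000.0
```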

Understanding these aspects better would clarify the behavior of the system and help others avoid similar confusion when building cool apps with MDB =).

BR Thanks

@dstafx

dstafx commented Feb 11, 2025

Hi @Cacsjep, sorry for the delay.
All products that include radar support, such as security radars and radar-video fusion cameras, have radar metadata on the com.axis.analytics_scene_description.v0.beta topic as well, starting from firmware release 12.2.
For consumers only interested in radar data (no video detections), the topic com.axis.radar.analytics_scene_description.v0.beta is also available on radar devices.

@Cacsjep

Cacsjep commented Feb 11, 2025

@dstafx Thank you. Could you please tell me whether MDB can work with view areas?

The source corresponds to the producer channel and not the channel you can set via VDO, right?

Image

So am I correct that MDB can't work with view areas?

@kevin-barudde-work

kevin-barudde-work commented Feb 12, 2025

Hi @Cacsjep,

In theory MDB can work with view areas by setting the source, as it refers to a producer channel and not VDO.

However, AXIS Scene Metadata does not produce any data for different view areas.

NOTE: The image you sent is for configuration of the RTSP stream, so it will have no effect on MDB.

@Cacsjep

Cacsjep commented Feb 12, 2025

@kevin-barudde-work

Ok, thanks. Has this been tested? When I checked it against another source (a view area), I don't get any MDB observations.
I just created a second view area with full image size and set .source = "2".
We thought about using it on magnified view areas to potentially achieve better results. But now I think that wouldn't make a difference anyway, because if MDB doesn't detect anything, it wouldn't detect anything in a view area either.

I was confused because VAPIX also shows only one video channel when there are multiple view areas. Is the video channel in the producer response a channel like in VDO, or is it something different, like the image sensor?
Image

Thank you for the clarification.

@kevin-barudde-work

@Cacsjep

You will not get any data via MDB because AXIS Scene Metadata is not produced for view areas. If we start supporting AXIS Scene Metadata for view areas, then you would use "source" to get access to it.

Today, source is only used for different image sensors; that is why you only get one.
NOTE: "listProducers" is for RTSP producers and is different from the producers in MDB.

@Cacsjep

Cacsjep commented Feb 12, 2025

@kevin-barudde-work

Thank you, now the circle is closed =)

BR christoph

@Cacsjep

Cacsjep commented Feb 12, 2025

@kevin-barudde-work any info about this #215 (comment) ?

Thx

@kevin-barudde-work

@Cacsjep

Sorry for the wait.

1:
It would be beneficial to add some documentation on how to sync the object tracking data with video. Regarding the delay, it is as you say: there is a ~250-500 ms gap caused by syncing video and metadata.

2:
The object tracking data on the topic com.axis.analytics_scene_description.v0.beta is produced approximately 10 times a second in most cases; there are some variations between camera models.
We are working on improving the documentation.

3:
The message broker does not add any significant delay. However, the algorithm currently producing the data on the topic com.axis.analytics_scene_description.v0.beta will usually have a processing delay of between 0.5 and 1 second, but there are no guarantees.
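The ~10 messages/second figure can be sanity-checked from message arrival times; a minimal sketch (Python; how you collect the arrival timestamps depends on your client, e.g. WebSocket or the C API):

```python
from statistics import mean

def message_rate_hz(arrival_times_s):
    """Estimate messages/second from a list of consecutive
    message arrival times, given in seconds."""
    gaps = [b - a for a, b in zip(arrival_times_s, arrival_times_s[1:])]
    return 1.0 / mean(gaps)
```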

@Cacsjep

Cacsjep commented Feb 20, 2025

@kevin-barudde-work

Thank you. So overall, an MDB observation could be 0.5 to 1 s out of sync compared to video because of the actual algorithm, correct?

Is there also a plan to be able to configure a minimum track movement, to be able to detect objects with small movements?

@kevin-barudde-work

@Cacsjep

Yes, observations are usually 0.5 to 1 s out of sync compared to video. Depending on the device, the out-of-sync range may differ somewhat.

Objects with no or small movement would be considered stationary objects and would be covered by the plan to include stationary objects.

@Cacsjep

Cacsjep commented Feb 21, 2025

@kevin-barudde-work perfecto thank you
