Towards Longer Long-Range Motion Trajectories

Although dense, long-range motion trajectories are a prominent representation of motion in videos, there is still no good solution for constructing dense motion tracks in a truly long-range fashion. Ideally, we would want every scene feature that appears in multiple, not necessarily contiguous, parts of the sequence to be associated with the same motion track. Despite this reasonable and clearly stated objective, there has been surprisingly little work on general-purpose algorithms that can accomplish this task. State-of-the-art dense motion trackers process the sequence incrementally, frame by frame, and by design associate features that disappear and reappear in the video with different tracks, thereby losing important information about the long-term motion signal. In this paper, we strive towards an algorithm for producing generic long-range motion trajectories that are robust to occlusion, deformation, and camera motion. We leverage accurate local (short-range) trajectories produced by current motion tracking methods and use them as an initial estimate for a global (long-range) solution. Our algorithm re-correlates the short trajectories and links them to form a long-range motion representation by formulating a combinatorial assignment problem that is defined and optimized globally over the entire sequence. This allows us to correlate features in arbitrarily distinct parts of the sequence, as well as to handle tracking ambiguities through spatiotemporal regularization. We report the results of the algorithm on both synthetic and natural videos, and evaluate the long-range motion representation for action recognition.

Bibliographic Details
Main Authors: Rubinstein, Michael, Liu, Ce, Freeman, William T.
Other Authors: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Format: Article
Language: en_US
Published: British Machine Vision Association 2015
Online Access: http://hdl.handle.net/1721.1/100283
https://orcid.org/0000-0002-3707-3807
https://orcid.org/0000-0002-2231-7995
description Although dense, long-range motion trajectories are a prominent representation of motion in videos, there is still no good solution for constructing dense motion tracks in a truly long-range fashion. Ideally, we would want every scene feature that appears in multiple, not necessarily contiguous, parts of the sequence to be associated with the same motion track. Despite this reasonable and clearly stated objective, there has been surprisingly little work on general-purpose algorithms that can accomplish this task. State-of-the-art dense motion trackers process the sequence incrementally, frame by frame, and by design associate features that disappear and reappear in the video with different tracks, thereby losing important information about the long-term motion signal. In this paper, we strive towards an algorithm for producing generic long-range motion trajectories that are robust to occlusion, deformation, and camera motion. We leverage accurate local (short-range) trajectories produced by current motion tracking methods and use them as an initial estimate for a global (long-range) solution. Our algorithm re-correlates the short trajectories and links them to form a long-range motion representation by formulating a combinatorial assignment problem that is defined and optimized globally over the entire sequence. This allows us to correlate features in arbitrarily distinct parts of the sequence, as well as to handle tracking ambiguities through spatiotemporal regularization. We report the results of the algorithm on both synthetic and natural videos, and evaluate the long-range motion representation for action recognition.
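The description outlines the core idea of re-correlating short trajectories and linking them via a globally optimized combinatorial assignment. As a rough, hypothetical illustration of that idea (not the paper's actual formulation), the sketch below matches the tail of each short trajectory fragment to the head of another using a cost that mixes appearance dissimilarity and spatial displacement, solved with a globally optimal linear assignment; the function name, cost weights, and threshold are all invented for illustration.

```python
# Hypothetical sketch: link short trajectory fragments into longer tracks
# by solving a single global assignment between fragment endpoints.
import numpy as np
from scipy.optimize import linear_sum_assignment

def link_fragments(ends, starts, end_desc, start_desc,
                   max_gap_cost=5.0, w_app=1.0, w_pos=0.1):
    """Match the tail of each fragment to the head of a later fragment.

    ends, starts         : (N, 2) / (M, 2) endpoint positions
    end_desc, start_desc : (N, D) / (M, D) appearance descriptors
    Returns a list of (i, j) links whose cost is below max_gap_cost.
    """
    # Pairwise cost: appearance dissimilarity plus spatial displacement.
    app = np.linalg.norm(end_desc[:, None, :] - start_desc[None, :, :], axis=2)
    pos = np.linalg.norm(ends[:, None, :] - starts[None, :, :], axis=2)
    cost = w_app * app + w_pos * pos
    # Globally optimal one-to-one matching (Hungarian algorithm).
    rows, cols = linear_sum_assignment(cost)
    return [(i, j) for i, j in zip(rows, cols) if cost[i, j] < max_gap_cost]
```

This toy version only links fragment pairs by appearance and proximity; the method described above additionally optimizes the assignment over the entire sequence and applies spatiotemporal regularization to resolve tracking ambiguities, which a pairwise matcher like this cannot do.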
Other Authors: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Funding: National Science Foundation (U.S.) (Grant CGV 1111415); NVIDIA Corporation (Fellowship)
Type: Conference Paper (http://purl.org/eprint/type/ConferencePaper)
Citation: Rubinstein, Michael, Ce Liu, and William T. Freeman. "Towards Longer Long-Range Motion Trajectories." Proceedings of the British Machine Vision Conference 2012 (2012).
ISBN: 1-901725-46-4
DOI: http://dx.doi.org/10.5244/C.26.53
Issued: 2012-09
Rights: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.