Relationship between sketch tool and existing edit parts [message #515879]
Mon, 22 February 2010 00:19
Hallvard Traetteberg (Senior Member, Trondheim, Norway)
First of all, I am interested in this project, as I've worked on something similar in the past!
In the video, it is shown how new edit parts can be created by means of sketching. Do you intend to support issuing requests on existing edit parts, too? This is more like gesturing, e.g. a zigzag or cross on an edit part (or rather its figure) to delete it, rather than sketching the shape of a symbol.
In my own work, gestures were matched against pre-defined polylines, and then translated into requests that were handled by the existing edit policy mechanism. The type and target of the request was determined by rules, e.g. a Z gesture inside an X edit part became a deletion request. The rules were encoded as XML in an extension.
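The rule lookup described above could be sketched roughly as follows. This is a minimal, hypothetical Java sketch, not the actual implementation: the names `GestureRules` and `requestFor`, and the use of plain strings for gesture and request types, are illustrative stand-ins for the XML-encoded rules and GEF request objects.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: rules map a recognized gesture name plus the target
// edit-part type to a request type, e.g. "a Z gesture inside an X edit part
// becomes a deletion request".
public class GestureRules {

    record Rule(String gestureName, String editPartType, String requestType) {}

    private final List<Rule> rules = new ArrayList<>();

    public void addRule(String gesture, String editPartType, String requestType) {
        rules.add(new Rule(gesture, editPartType, requestType));
    }

    // Returns the request type for a gesture on a given edit-part type,
    // or null if no rule matches (the caller then falls back to other behavior).
    public String requestFor(String gesture, String editPartType) {
        for (Rule r : rules) {
            if (r.gestureName().equals(gesture)
                    && r.editPartType().equals(editPartType)) {
                return r.requestType();
            }
        }
        return null;
    }
}
```

In the real setup the resulting request would then be dispatched through the existing edit policy mechanism rather than handled directly.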
A nice trick was the behavior if no rules matched. First it located the edit parts underneath the start and end points and then tried to connect them using a specified (and existing) connection tool. If this failed, all the points that the gesture consisted of were used to drive the selection tool, so selecting, moving and resizing could be performed without leaving the gesture tool.
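That fallback chain could look something like the following sketch. Again this is hypothetical code under assumed interfaces: `ConnectionTool` and `SelectionTool` stand in for the real GEF tools, which would locate edit parts and issue requests themselves.

```java
import java.awt.Point;
import java.util.List;

// Hypothetical sketch of the no-match fallback: first try to connect the
// edit parts under the gesture's start and end points; if that fails,
// replay all of the gesture's points through the selection tool, so
// selecting, moving and resizing work without leaving the gesture tool.
public class GestureFallback {

    public interface ConnectionTool { boolean tryConnect(Point start, Point end); }
    public interface SelectionTool { void replay(List<Point> points); }

    private final ConnectionTool connectionTool;
    private final SelectionTool selectionTool;

    public GestureFallback(ConnectionTool c, SelectionTool s) {
        this.connectionTool = c;
        this.selectionTool = s;
    }

    // Returns which fallback handled the unmatched gesture.
    public String handleUnmatched(List<Point> gesturePoints) {
        Point start = gesturePoints.get(0);
        Point end = gesturePoints.get(gesturePoints.size() - 1);
        if (connectionTool.tryConnect(start, end)) {
            return "connected";
        }
        selectionTool.replay(gesturePoints);
        return "selection";
    }
}
```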
Re: Relationship between sketch tool and existing edit parts [message #516172 is a reply to message #515879]
Tue, 23 February 2010 00:03
Hi Hallvard,
Yes, we're definitely planning to support gestures. Jens von Pilgrim and Kristian Duske
from GEF3D showed interest in that feature as well; I think it's an essential feature to have in the API.
The farthest I got was translating a touch, such as a click, into a direct-edit request on an edit part.
It's interesting how you did it; I would like to see it. The current implementation matches gestures against polylines
translated into sequences of words (serialized with Properties), so they can be compared using the Levenshtein algorithm.
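The matching step mentioned here, Levenshtein distance over word sequences, could be sketched like this. The class name `GestureMatcher` and the direction "words" (N, E, S, ...) are illustrative assumptions, not taken from the project's code.

```java
import java.util.List;

// Hypothetical sketch: each stroke is quantized into a sequence of direction
// "words", and stored polylines are ranked by the Levenshtein (edit) distance
// between word sequences; the smallest distance wins.
public class GestureMatcher {

    // Classic dynamic-programming Levenshtein distance, operating on
    // sequences of words instead of characters.
    public static int distance(List<String> a, List<String> b) {
        int[][] d = new int[a.size() + 1][b.size() + 1];
        for (int i = 0; i <= a.size(); i++) d[i][0] = i;
        for (int j = 0; j <= b.size(); j++) d[0][j] = j;
        for (int i = 1; i <= a.size(); i++) {
            for (int j = 1; j <= b.size(); j++) {
                int cost = a.get(i - 1).equals(b.get(j - 1)) ? 0 : 1;
                d[i][j] = Math.min(
                        Math.min(d[i - 1][j] + 1,      // deletion
                                 d[i][j - 1] + 1),     // insertion
                        d[i - 1][j - 1] + cost);       // substitution
            }
        }
        return d[a.size()][b.size()];
    }
}
```

For example, the sequence ["N", "E", "S"] is one insertion away from ["N", "E", "E", "S"], so their distance is 1.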
I think the rest is pretty much the same.
Anyway, I think it's crucial for this API to have a flexible mechanism so that many algorithms can take part
in the recognition, deciding what is going on (creation, gesture, connection, and so on).
Would you like to contribute to the project, once it's created?
Best,
Ugo