
Sketch

The Sketch project is a proposed open source project under the Technology Container Project.

This proposal is in the Project Proposal Phase (as defined in the Eclipse Development Process) and is written to declare its intent and scope. We solicit additional participation and input from the Eclipse community. Please send all feedback to the Sketch forum.

Background

[Screenshot: an example of the current development state of the API]

[Figure: overview of the API structure; the line divides its integration with GEF]

Nowadays, with the increasing popularity of touch-enabled devices, it is becoming more common to let users enter information directly on the screen, using their fingers or a pen, instead of a mouse. This project aims to enrich the user's interaction with GEF/GMF editors using the capabilities provided by touch/tablet devices.

Sketch provides an API that allows users to insert elements (EditParts) into GEF/GMF editors by capturing their gestures, typically made with a pen on a tablet. The idea is that the user can freely draw the representation of an element on the editor's area. The API then transforms this rough representation into something processable, interpreting which element the user meant to insert.

The API can learn from user input, recognizing each sketch in the way it is drawn by different users. When it is impossible to determine which element a sketch represents, the API 'asks' the user, through a small dialog, what he/she meant by that sketch. The API is thus able to recognize similar sketches the next time.

Scope

The project will provide an API and a basic example editor for shapes such as Squares, Triangles and Circles. The API is targeted at developers who wish to add sketching capabilities to GEF/GMF editors through a small set of configurations, in which it is possible to specify, among other things, the elements that should be recognized. The example editor will show how to use the API and expose the provided features to an end user.
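As a purely hypothetical illustration of what such a configuration could amount to (the class and element names below are assumptions made for illustration, not the actual Sketch API), the developer essentially maps the element names the recognizer may report to the EditParts that should be created:

    import java.util.LinkedHashMap;
    import java.util.Map;

    /**
     * Hypothetical configuration sketch: the editor developer declares which
     * elements should be recognized. All names here are illustrative only.
     */
    public class RecognizableElementsExample {
        public static void main(String[] args) {
            // element name reported by the recognizer -> EditPart to be created
            Map<String, String> recognizable = new LinkedHashMap<>();
            recognizable.put("square", "SquareEditPart");
            recognizable.put("triangle", "TriangleEditPart");
            recognizable.put("circle", "CircleEditPart");

            // when the recognizer answers "square", the editor inserts a SquareEditPart
            System.out.println(recognizable.get("square"));
        }
    }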

Description

Traditionally, GEF editors use one tool for each element of the domain: the user must select a tool from the palette and click on the editor to add a new element to the diagram. The goal of Sketch is to enable users to draw and connect the elements inside the editor's area, using their own gestures.

The Sketch API is targeted primarily at editors with a large set of elements, in which the user has to search through the whole GEF palette in order to add them to the editor.

Initial Contribution

The API is currently composed of three parts: a sketch Tool, a Recognizer and a sketch Bank (or database). The API structure figure above shows how the API integrates with GEF, through AbstractTool and GraphicalEditor. The upper part of the figure is an example of integration: it shows a GEF editor (ShapesDiagramEditor) and three tools (for Square, Triangle and a connection). The integration is made through the SketchTool, a tool that observes the user's clicks and drags and provides visual feedback of what is being drawn on the editor. Once the sketch is finished, the tool passes the captured points to the SketchManager, which in turn processes those points and updates the editor.
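As a rough, simplified illustration of that flow (plain Java, not the actual GEF integration; the class and method names here are assumptions), the tool accumulates points while the user drags and hands the complete stroke to a manager when the gesture ends:

    import java.awt.Point;
    import java.util.ArrayList;
    import java.util.List;

    /** Illustrative stand-in for the SketchTool described above (not the real class). */
    class SimpleSketchTool {
        private final List<Point> stroke = new ArrayList<>();
        private final StrokeConsumer manager;

        SimpleSketchTool(StrokeConsumer manager) {
            this.manager = manager;
        }

        /** Called while the user drags the pen: record the point. */
        void onDrag(int x, int y) {
            stroke.add(new Point(x, y));
            // a real tool would also draw the point on the editor as visual feedback
        }

        /** Called when the pen is lifted: pass the finished stroke on for recognition. */
        void onFinish() {
            manager.process(new ArrayList<>(stroke));
            stroke.clear();
        }
    }

    /** Plays the role of the SketchManager: receives the points and updates the editor. */
    interface StrokeConsumer {
        void process(List<Point> points);
    }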

The recognition is made using a chain of algorithms, each of which can take part in the decision. The SketchManager gives the list of points captured by the SketchTool to the SketchElementRecognizer, and then periodically probes the recognizer for a result. The SketchElementRecognizer creates the chain from a set of algorithms: if the first algorithm of the chain is not 'sure' about what the sketch means, it can pass the decision to the next one in the chain, and the chain returns an element, or null if no element could be determined.
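A minimal sketch of that hand-off, under the assumption that recognition runs in the background while the manager polls for the outcome (illustrative only; the real recognizer's interface may differ):

    import java.awt.Point;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    /** Illustrative recognizer that is fed points and later probed for a result. */
    class PollingRecognizerExample {
        private final ExecutorService executor = Executors.newSingleThreadExecutor();
        private Future<String> pending;

        /** The manager hands over the captured points; recognition runs asynchronously. */
        void submit(List<Point> points) {
            pending = executor.submit(() -> recognize(points));
        }

        /** The manager probes periodically; returns the element name, or null if not ready or unknown. */
        String probe() {
            if (pending == null || !pending.isDone()) {
                return null;
            }
            try {
                return pending.get();
            } catch (Exception e) {
                return null;
            }
        }

        private String recognize(List<Point> points) {
            // placeholder: the chain of recognition algorithms would run here
            return points.size() > 2 ? "square" : null;
        }
    }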

The first algorithm in the current chain is based on Levenshtein's algorithm for string distance. It receives the user's sketch transformed into a set of points and places them on a grid; each point is then transformed into a number according to the next point's position. This approach is described in detail here.
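One plausible reading of that encoding step (not necessarily the project's exact algorithm) is a direction code: snap each point to a grid cell and encode the direction towards the next point as a digit, producing a 'word' that describes the stroke:

    import java.awt.Point;
    import java.util.List;

    /** Illustrative grid/direction encoding of a stroke into a 'word'. */
    class DirectionEncoderExample {

        static String encode(List<Point> stroke, int cellSize) {
            StringBuilder word = new StringBuilder();
            for (int i = 0; i < stroke.size() - 1; i++) {
                // compare the grid cells of the current point and the next one
                int dx = Integer.signum(stroke.get(i + 1).x / cellSize - stroke.get(i).x / cellSize);
                int dy = Integer.signum(stroke.get(i + 1).y / cellSize - stroke.get(i).y / cellSize);
                if (dx == 0 && dy == 0) {
                    continue; // both points fall in the same cell
                }
                // map the eight possible directions to the digits 0..7
                int code = (dy + 1) * 3 + (dx + 1);      // 0..8, where 4 (no move) is excluded
                word.append(code > 4 ? code - 1 : code); // compress to 0..7
            }
            return word.toString();
        }
    }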

For each element there is a set of stored 'words' that describe its many forms, as drawn by the user. The recognizer compares each new sketch with the words of each element, calculating the distance between them. If the distance is smaller for a square than for a triangle, for example, then the element is probably a square.
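A minimal sketch of that comparison, assuming the encoded stroke and the stored forms are plain strings (the project's actual distance implementation may differ):

    import java.util.List;
    import java.util.Map;

    /** Illustrative nearest-word classification using Levenshtein distance. */
    class WordMatcherExample {

        /** Classic dynamic-programming Levenshtein distance between two words. */
        static int distance(String a, String b) {
            int[][] d = new int[a.length() + 1][b.length() + 1];
            for (int i = 0; i <= a.length(); i++) d[i][0] = i;
            for (int j = 0; j <= b.length(); j++) d[0][j] = j;
            for (int i = 1; i <= a.length(); i++) {
                for (int j = 1; j <= b.length(); j++) {
                    int cost = a.charAt(i - 1) == b.charAt(j - 1) ? 0 : 1;
                    d[i][j] = Math.min(Math.min(d[i - 1][j] + 1, d[i][j - 1] + 1), d[i - 1][j - 1] + cost);
                }
            }
            return d[a.length()][b.length()];
        }

        /** Returns the element whose stored words are closest to the sketched word, or null. */
        static String closestElement(String sketchWord, Map<String, List<String>> storedWords) {
            String best = null;
            int bestDistance = Integer.MAX_VALUE;
            for (Map.Entry<String, List<String>> element : storedWords.entrySet()) {
                for (String word : element.getValue()) {
                    int dist = distance(sketchWord, word);
                    if (dist < bestDistance) {
                        bestDistance = dist;
                        best = element.getKey();
                    }
                }
            }
            return best;
        }
    }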

The Chain of Responsibility pattern is applied to allow other algorithms to take part in the decision about which element is meant by a given sketch. Currently, if a sketch is not recognized by the LevenshteinHandler, it is passed to the ConnectionHandler, which analyses whether the user is trying to connect two elements.
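The pattern itself can be sketched as follows (an illustrative skeleton, not the actual handler interface): each handler tries to recognize the stroke and, when it is not sure, forwards the stroke to the next handler in the chain:

    import java.awt.Point;
    import java.util.List;

    /** Illustrative Chain of Responsibility skeleton for sketch recognition. */
    abstract class RecognitionHandlerExample {
        private RecognitionHandlerExample next;

        RecognitionHandlerExample setNext(RecognitionHandlerExample next) {
            this.next = next;
            return next;
        }

        /** Returns the recognized element name, or passes the stroke down the chain. */
        String recognize(List<Point> stroke) {
            String result = tryRecognize(stroke);
            if (result != null) {
                return result;
            }
            return next != null ? next.recognize(stroke) : null;
        }

        /** Each concrete handler attempts recognition and returns null when it is not 'sure'. */
        protected abstract String tryRecognize(List<Point> stroke);
    }

In this picture, a Levenshtein-based handler would implement tryRecognize with the word matching shown above, and a connection handler would check whether the stroke starts and ends on two existing elements.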

Committers

The following individuals are proposed as initial committers to the project:

  • Ugo Sangiorgi (Project Lead)
  • Mariot Chauvin, Obeo

Mentors

The following Architecture Council members will mentor this project:

  • Chris Aniszczyk
  • Ed Merks

Interested Parties

The following individuals, organisations, companies and projects have expressed interest in this project:

  • Chris Aniszczyk
  • Ed Merks
  • Ugo Sangiorgi
  • Obeo (http://www.obeo.fr)
  • Aurélien Pupier (BonitaSoft)
  • Mickael Istria (BonitaSoft)
  • Oisín Hurley
  • David Sciamma (Sierra Wireless)
  • Jens von Pilgrim (GEF3D)
  • Kristian Duske (GEF3D)

Project Scheduling

An initial build should be ready within a month of project approval, together with a working example application.

Changes to this Document

Date Change
02-02-2010 Document created
