(no subject) [message #674048 is a reply to message #674027]
Wed, 25 May 2011 16:21
Henrik Lindberg
I agree with Alex, as I also don't think it is a good idea in general,
but I can think of languages that are supposed to be "natural language
like", where the keywords throw the author off when they are not in their
natural language. (OTOH, word order differs between natural
languages, and just changing the keywords is IMO worse than
restricting the language to only feel natural in a single language.)
If you want to try to implement this,
there are several options I can think of (each with different pros and cons).
1. NL is part of one single language
1.1 Using multiple keywords
You can simply use multiple keywords in your grammar e.g.:
Entity : EntityKw name=ID ... ;
EntityKw : "entity" | "einheit" | "entidade" ;
Now users can enter one of them. Serialization will by default pick
"entity" (as it is first) if you go from model to text (which btw
happens if you make semantic quick fixes).
You need to do something about scoping/content assist so you only propose
keywords in the user's NL (the simple implementation in the example will
list all three keywords). If you do not want to rely on the order, you may
need to split the keyword rule per language:
EntityKw : EntityKw_EN | EntityKw_DE | EntityKw_PT ;
EntityKw_EN : "entity" ;
EntityKw_DE : "einheit" ;
EntityKw_PT : "entidade" ;
You are probably in for a rough ride with issues popping up in almost
every feature of Xtext...
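As a sketch of the content-assist filtering mentioned above, you could key a keyword table by language and only propose the user's spellings. Note this is plain Java illustrating the idea, not Xtext API; the table and class names are assumptions:

```java
import java.util.List;
import java.util.Map;

// Hypothetical keyword table: canonical concept -> spelling per language.
// In a real Xtext proposal provider you would filter the proposed keywords
// using a table like this, so only the user's NL shows up.
public class KeywordFilter {
    static final Map<String, Map<String, String>> KEYWORDS = Map.of(
        "entity", Map.of("en", "entity", "de", "einheit", "pt", "entidade"));

    /** Keep only the spellings that belong to the given language. */
    public static List<String> proposalsFor(String lang) {
        return KEYWORDS.values().stream()
            .map(m -> m.get(lang))
            .filter(s -> s != null)
            .toList();
    }

    public static void main(String[] args) {
        System.out.println(proposalsFor("de")); // only the German spelling
    }
}
```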
1.2 Use a different lexer
By using an external lexer, you can make the lexer NL driven. It will
then return KEYWORD_ENTITY for the NL input.
The problem now is that the grammar is used to generate a bunch of
additional things, and it does not know that the keywords are configurable.
You will need to investigate if it is possible to inject some kind of
keyword configuration into those generated artifacts.
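The core of such an external lexer is a lookup from every localized spelling to one canonical token type, so the parser never cares which language the user typed. A minimal sketch, where the token name and table are assumptions:

```java
import java.util.Map;

// Hypothetical mapping from localized surface forms to one canonical
// token type: the parser only ever sees KEYWORD_ENTITY regardless of
// which spelling ("entity", "einheit", "entidade") appeared in the text.
public class LocalizedLexer {
    static final Map<String, String> TOKEN_BY_WORD = Map.of(
        "entity", "KEYWORD_ENTITY",
        "einheit", "KEYWORD_ENTITY",
        "entidade", "KEYWORD_ENTITY");

    /** Return the canonical token name, or ID for any other word. */
    public static String tokenFor(String word) {
        return TOKEN_BY_WORD.getOrDefault(word, "ID");
    }
}
```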
2. Different NL versions are different languages
As Alex suggests, you could define multiple grammars against the same
model and differentiate them on file extension - i.e. "mydsl_en",
"mydsl_de", and so on.
You can probably structure this with a main grammar and one grammar per
language, but the issue is that the main grammar needs to reference the
different language grammars. As the reference is inside the grammar, it
may pose some problems to process this - you need to figure out how to
handle some variability in how the reference to the included (language)
grammar is expressed.
You will need to work on where/how things are generated, and the plugin
metadata (to get all your languages exposed).
Depending on how you deal with cross references, you may need to come up
with a mechanism to refer to a resource without having to specify the
language it is written in - i.e. is foo.mydsl_en the same logical unit
as foo.mydsl_de?
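One possible shape for such a mechanism is to strip the language suffix from the extension, so foo.mydsl_en and foo.mydsl_de resolve to the same logical name. The `_xx` suffix scheme here is an assumption, not anything Xtext prescribes:

```java
// Hypothetical helper: map localized file names to one logical unit
// name, so cross references can omit the language the file is written in.
public class LogicalName {
    /** Drop a trailing _xx language suffix from the file extension. */
    public static String of(String fileName) {
        return fileName.replaceFirst("(\\.[A-Za-z]+)_[a-z]{2}$", "$1");
    }
}
```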
3. Give up
Users are probably better off learning one language, and probably better
served if you spend your time on more valuable features.
On 5/25/11 4:59 PM, Alexander Nittka wrote:
> there is no framework support for that, and I assume that it is quite
> some effort to implement that.
> Personally, I don't even think it's a good idea.
> Of course you can implement several DSLs working on the same meta model.
Re: (no subject) [message #674325 is a reply to message #674048]
Thu, 26 May 2011 14:07
Cristiano Gaviao
Hi Alex and Henrik,
Thanks for the answers. (Strangely, I was not notified about the replies.)
Well, my idea was to use Xtext to create an Eclipse plugin to help us write requirements using BDD (behavior-driven development), which is a 'natural language like' task. After writing and validating the text, we could then generate (and sync) the Java test code (steps) for use with JBehave.
JBehave is already a multi-language acceptance-test framework, where I can localize the keywords used (Given, When, Then, And) and use my own language for writing the stories and scenarios. Today I use a simple text editor to write them, and the file extensions can also be localized.
So it makes sense that the plugin created for it be localizable too.
I will study Xtext a little more and experiment with the alternatives you have exposed...