Re: [henshin-user] Nested application conditions in Henshin diagrams?

Thanks, Matthias. I have implemented this for my metamodel and it seems to work. The only strange thing at the moment is that the rule seems to remove all tokens from all input places rather than just one token per place. Clearly, I have a misunderstanding of multi-rules here. Can you help?

BTW, how does one get the textual syntax?

Many thanks,


Dr. rer. nat. Steffen Zschaler AHEA
Senior Lecturer

King's College London
Department of Informatics

Email szschaler@xxxxxxx
Phone +44 (020) 7848 1513

-----Original Message-----
From: henshin-user-bounces@xxxxxxxxxxx [mailto:henshin-user-bounces@xxxxxxxxxxx] On Behalf Of Matthias Tichy
Sent: 20 October 2018 08:53
To: henshin-user@xxxxxxxxxxx
Subject: Re: [henshin-user] Nested application conditions in Henshin diagrams?


> I'm trying to figure out how to do nested application conditions in 
> the Henshin diagram editor. I was hoping that this could be done the 
> same way that I can nest multi-rules, using paths in the labels. 
> However, that doesn't seem to do anything useful. Is there a different 
> way of doing this?
> Specifically, I'm trying to write a standard Petri Net firing rule. 
> For this, I would like to add a NAC that there are no source places 
> with no tokens. So, roughly,
> ¬∃({Transition-Place} → ¬∃{Transition-Place-Token})
> I've tried using a require* instead, as below, but that rule never 
> matches, even when all source places have a token.
> Any suggestions?

I think it is not possible with the graphical editor, so either use the tree editor (which I did for this use case, see attachment) or the textual editor.
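Independent of how the condition is expressed in Henshin, the semantics being encoded here can be sketched in plain Python (all names below are illustrative, not Henshin API): the nested NAC "there is no input place with no tokens" is equivalent to requiring that every input place of the transition holds at least one token, and the intended multi-rule behaviour is to remove exactly one token per matched input place.

```python
# Illustrative sketch of the firing condition from the thread, NOT Henshin code.
# A marking maps place names to token counts.

def enabled(marking, input_places):
    """The negated nested condition: no input place is empty."""
    return not any(marking[p] == 0 for p in input_places)

def fire(marking, input_places, output_places):
    """Remove ONE token per input place and add one per output place
    (the intended multi-rule behaviour: one token per matched place)."""
    if not enabled(marking, input_places):
        return marking  # rule does not match; marking unchanged
    m = dict(marking)
    for p in input_places:
        m[p] -= 1
    for p in output_places:
        m[p] = m.get(p, 0) + 1
    return m
```

If a Henshin multi-rule instead matches every token of an input place, each application of the kernel/multi-rule combination would delete all of them, which matches the behaviour Steffen describes above.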



Prof. Dr. Matthias Tichy
Institute of Software Engineering and Programming Languages
Faculty of Engineering, Computer Science and Psychology
Ulm University
89069 Ulm, Germany

Tel.:  +49 731 50-24160
Fax:   +49 731 50-24162
email: matthias.tichy@xxxxxxxxxx
