The translation between the models of two bounded contexts requires a common language.
* * *
Direct translation to and from the existing domain models may not be a good solution. Those models may be overly complex or poorly factored. They are probably undocumented. If one is used as a data interchange language, it essentially becomes frozen and cannot respond to new development needs.
Therefore:
Use a well-documented shared language that can express the necessary domain information as a common medium of communication, translating as necessary into and out of that language.
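As a rough illustration of that translation, the sketch below uses a hypothetical shipping example: an internal `Cargo` model, a published `ShipmentReport` record standing in for the shared language, and a translator sitting at the context boundary. All of these names are invented for the example; the point is only that outside contexts depend on the published form, never on the internal model, which stays free to evolve.

```java
// Internal domain model -- private to this bounded context,
// free to be refactored without breaking other contexts.
class Cargo {
    private final String trackingId;
    private final String lastKnownPort;

    Cargo(String trackingId, String lastKnownPort) {
        this.trackingId = trackingId;
        this.lastKnownPort = lastKnownPort;
    }

    String trackingId() { return trackingId; }
    String lastKnownPort() { return lastKnownPort; }
}

// Published language -- the well-documented, shared interchange
// form. Other contexts couple to this shape, not to Cargo.
record ShipmentReport(String shipmentId, String location) {}

// Translator at the boundary: into and out of the published
// language, so neither side's model leaks across.
class ShipmentReportTranslator {
    ShipmentReport toPublished(Cargo cargo) {
        return new ShipmentReport(cargo.trackingId(), cargo.lastKnownPort());
    }

    Cargo fromPublished(ShipmentReport report) {
        return new Cargo(report.shipmentId(), report.location());
    }
}
```

Note that `fromPublished` can reconstruct only what the published language carries; any richer internal state would have to come from defaults or lookups inside the receiving context.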
* * *

How do you focus on your central problem and keep from drowning in a sea of side issues?
*Distillation* is the process of separating the components of a mixture to extract the essence in a form that makes it more valuable and useful. A model is a distillation of knowledge. With every refactoring to deeper insight, we abstract some crucial aspect of domain knowledge and priorities. Now, stepping back for a strategic view, this chapter looks at ways to distinguish broad swaths of the model and distill the domain model as a whole.