
Infrastructure for using "translate5 language resources" as training resources for MT

Details

    • High
    • Cross Language Resource synchronisation mechanism and abstraction layer introduced into the application. From now on there is a mechanism to connect different Language Resource types (such as t5memory, Term Collection, etc.) for data synchronisation where possible.

    Description

      PLEASE: A good infrastructure base, but no over-engineering. Only what we can see that we definitely need as a base, e.g. to cover the current GPT and DeepL use cases together.

      The scenario we have with the linked issue TRANSLATE-3498 will in the future occur for many other MT language resources: we use TMs and/or TermCollections in translate5 as automatically provided training data for the MT resources.

      Therefore, for TermCollections the triggers pointed out in TRANSLATE-3498 should be implemented in a generic way, so that they are easy to reuse for other MT resources such as GPT.

      The administration of MT resources connected to TermCollections or TMs should work in the same way in the GUI and the back-end.

      The trigger points for changes are also the same for TermCollections.

      Yet, the way of training might differ between MT resources:

      • For DeepL it is necessary to delete and recreate the glossary with each change.
      • For GPT, newly created terminology should simply be added, and GPT should be informed about the deletion of deleted or changed terms.
      • For GPT, TM training should work analogously to terminology training (see the bullet above).
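
The per-resource differences above suggest a small strategy abstraction behind one common interface. A minimal Python sketch of that idea follows; all class and method names here are hypothetical illustrations, not actual translate5 code:

```python
from abc import ABC, abstractmethod


class TrainingStrategy(ABC):
    """How one MT resource consumes changed training data (hypothetical interface)."""

    @abstractmethod
    def on_terms_changed(self, added: dict, removed: list) -> None:
        ...


class DeepLStrategy(TrainingStrategy):
    """DeepL glossaries cannot be edited in place: rebuild the glossary on every change."""

    def __init__(self):
        self.glossary = {}

    def on_terms_changed(self, added: dict, removed: list) -> None:
        # Stands in for: delete the remote glossary, recreate it with the full term set.
        current = {k: v for k, v in self.glossary.items() if k not in removed}
        current.update(added)
        self.glossary = current


class GPTStrategy(TrainingStrategy):
    """GPT is updated incrementally: push new terms, notify about deletions/changes."""

    def __init__(self):
        self.pushed = {}
        self.notified_deleted = []

    def on_terms_changed(self, added: dict, removed: list) -> None:
        self.pushed.update(added)            # add newly created terminology
        self.notified_deleted += removed     # inform GPT about deleted/changed terms
```

With this shape, the generic trigger layer only needs to call `on_terms_changed` on whatever strategies are connected to a TermCollection or TM; each resource decides internally whether that means a full rebuild (DeepL) or an incremental push (GPT).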

      Trigger points for TMs to update an MT resource

      Same as for TermCollections, except those related to editing terms in the TermPortal.

      In addition:

      • Saving/updating a segment in the translate5 editor
      • Creating/updating/deleting the source and/or target of translation units in TM maintenance
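
The trigger points listed above can be wired to the connected MT resources through a simple publish/subscribe dispatcher, so that new trigger points or new resource types can be added without touching each other. A minimal sketch, assuming hypothetical event names (`segment.saved`, `tm.unit.changed`) that are not taken from translate5 itself:

```python
from collections import defaultdict


class SyncDispatcher:
    """Generic trigger hub: trigger points publish events,
    connected MT resources subscribe (hypothetical sketch)."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def connect(self, event: str, handler) -> None:
        self._subscribers[event].append(handler)

    def fire(self, event: str, payload: dict) -> None:
        for handler in self._subscribers[event]:
            handler(payload)


dispatcher = SyncDispatcher()
log = []

# The two additional TM trigger points from the list above:
dispatcher.connect("segment.saved", lambda p: log.append(("segment.saved", p)))
dispatcher.connect("tm.unit.changed", lambda p: log.append(("tm.unit.changed", p)))

dispatcher.fire("segment.saved", {"segmentId": 42})
dispatcher.fire("tm.unit.changed", {"op": "update"})
```

The same dispatcher could carry the TermCollection trigger points from TRANSLATE-3498, keeping the trigger side identical for all resource types while the handlers differ per MT resource.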

People

  sanya@mittagqi.com Sanya Mikhliaiev
  marcmittag Marc Mittag [Administrator]
  Axel Becher, Thomas Lauria
  Votes: 0
  Watchers: 1

Dates

  Created:
  Updated:
  Resolved:
