My review and impression of memoQ 2014

memoQ logo

Readers of this blog and those who might have stumbled upon some of my occasional posts on the subject in social media know that translation supporting technology and workflows are areas I take a special interest in. I’m always grateful for opportunities to learn about new and different approaches and tools.

In recent years memoQ has gradually risen to become one of the major commercial TEnTs (Translation Environment Tools) available on the market. Previously I had only very brief and superficial experience with older versions of memoQ, so when memoQ 2014 was released in June 2014, I was excited to take the opportunity to test it for a few months as my main production environment.

Being a relatively experienced user of Studio 2014, my original plan was to point out the similarities and differences in features and approach between the two TEnTs, but Emma Goldsmith had a similar idea and compiled an exhaustive side-by-side comparison of memoQ 2014 and SDL Trados Studio 2014, doing a much better job than I ever could. Her comparison is compulsory reading for anyone interested in learning about the similarities and differences between memoQ 2014 and SDL Studio 2014.

Instead, I decided to do a more general review of memoQ, starting with some of the major new features in memoQ 2014, continuing with some general features and approaches I like in memoQ, and concluding with my brief impression of memoQ 2014 after about three months of using it as my main production environment.

Note and Clarification

memoQ, SDL Studio, and most other TEnTs share the same basic concepts and workflow, which will not be covered in this post. I’m also not trying to determine whether memoQ is better than SDL Studio or any other TEnT, not least because I don’t believe in such a universal determination. The definition of ‘better’ always depends on personal preference, needs, and circumstances.

Continue reading →

What can we learn from Facebook’s Emotional Contagion Experiment?

Can emotional states be manipulated just by the use of emotion-evoking communication? This was the question Facebook set out to answer in a study that was published in the March 2014 issue of Proceedings of the National Academy of Sciences (PNAS) and stirred up quite a controversy. The paper, titled "Experimental evidence of massive-scale emotional contagion through social networks" (PDF file), describes a psychological experiment that Facebook conducted during one week in January 2012 among 689,003 of its users to test how the amount of negative or positive content to which users are exposed affects their emotional state (i.e. mood) and their resulting behavior on Facebook.

In this blog article I don’t intend to discuss the controversy surrounding this research (here is the official reply of one of the researchers to the controversy). This is an important topic that deserves discussion, but it is outside the scope of this particular article.

Instead, I want to examine the results from a professional translation practitioner’s angle and suggest two lessons that I think we can learn from the study about the role of language in effective communication and social atmosphere setting.

Continue reading →

The Glossary Addons: (almost) The Missing Terminology Link

The main window of the Glossary Converter app available for SDL Studio

One of the major complaints of many SDL Studio users is the lack of a "simple" terminology management module. MultiTerm, Studio’s terminology management environment, is a full-fledged terminology management module, but not everyone needs all the functionality it offers, and some would prefer a simpler, more straightforward workflow that better suits their more modest terminology management needs.

Furthermore, MultiTerm’s traditional workflow can be a little daunting at first. Creating, importing, or exporting a termbase (the file storing all the terminology information) is not as intuitive and straightforward as some would have liked. And if that wasn’t enough, while Studio’s Editor environment enables the user to add and edit terms in a termbase that has been added to a project, the termbase itself must first be created in MultiTerm, which is an external software package to SDL Studio. This adds another layer of confusion about how SDL Studio and MultiTerm interact with one another.

In this article I want to discuss how two Studio 2014 Open Exchange apps (to be accurate, one of them is actually a plugin) simplify the terminology preparation and exchange workflows, and why I think their significance to the Studio ecosystem is greater than the sum of their parts.
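To illustrate the kind of transformation these tools perform behind the scenes, here is a minimal sketch (not the actual code of either app) that turns a flat two-column glossary into structured termbase XML, using the basic termEntry/langSet/tig layout of the TBX exchange standard. The sample terms and language codes are my own invention, and a real MultiTerm termbase (.sdltb) uses its own proprietary format rather than plain TBX.

```python
# Illustrative sketch: flat glossary pairs -> minimal TBX-style XML.
import xml.etree.ElementTree as ET

glossary = [("cat", "chat"), ("dog", "chien")]  # (English, French) pairs

martif = ET.Element("martif", type="TBX")
body = ET.SubElement(ET.SubElement(martif, "text"), "body")
for en, fr in glossary:
    entry = ET.SubElement(body, "termEntry")
    for lang, term in (("en", en), ("fr", fr)):
        lang_set = ET.SubElement(entry, "langSet", {"xml:lang": lang})
        # Each language section holds a term information group (tig).
        ET.SubElement(ET.SubElement(lang_set, "tig"), "term").text = term

tbx = ET.tostring(martif, encoding="unicode")
print(tbx)
```

The point of the sketch is how little the user should have to care about this structure: the appeal of the Glossary Converter is precisely that it hides this markup behind a drag-and-drop conversion.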

Continue reading →

TEnTs Interoperability (or how to set up a WordFast or MemoQ Project in Studio 2014)

Interoperability is a topic I have taken a special interest in since I started using Translation Environment Tools (TEnTs). Most TEnTs store their data in proprietary file formats, which makes it that much harder to share or migrate information. One unfortunate result of this difficulty is that it enables some unethical practices, and, even more importantly, it leaves users feeling “captive” to the proprietary formats, forced to use a certain tool over another regardless of their workflow needs or preferences, unless they are willing to spend time and effort on workarounds that are almost never guaranteed to work, or worse, to invest money in tools just to use their filters in the pre-processing stage. This resonates strongly with me because I’m firmly against what I believe is a harmful, misleading, and near-sighted infatuation with technology that puts the technology before the human professional. I believe that the human professional is the most important element in any professional activity and that the technology is just there to serve the professional as a tool. Therefore, professionals must be able to choose their tools by merit, experience, and expertise, with as few artificial obstacles as possible influencing the decision.

In recent years quite a few advancements have been made in TEnT interoperability, probably driven by the increased range of TEnTs available on the market and the emphasis some developers have put on better standards support and interoperability from the get-go. Nowadays most modern Translation Environment Tools can exchange information via standardized file formats – primarily XLIFF (for bilingual files) and TMX (for exchanging Translation Memory data) – and some even offer native or extendable (via add-ons) support for reading and writing the proprietary formats of other TEnTs.
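To make the TMX side of this exchange concrete, here is a minimal sketch of reading translation units with Python’s standard library. The embedded sample document and language codes are my own invention for illustration; real TMX exports from a TEnT typically carry more metadata and inline markup inside the seg elements.

```python
# Minimal sketch: extract translation units from a TMX document.
import xml.etree.ElementTree as ET

# ElementTree expands the predefined xml: prefix to this namespace URI.
XML_LANG = "{http://www.w3.org/XML/1998/namespace}lang"

tmx = """<tmx version="1.4">
  <header srclang="en-US" datatype="plaintext" segtype="sentence"
          creationtool="demo" creationtoolversion="1.0"
          adminlang="en-US" o-tmf="demo"/>
  <body>
    <tu>
      <tuv xml:lang="en-US"><seg>Hello world</seg></tuv>
      <tuv xml:lang="fr-FR"><seg>Bonjour le monde</seg></tuv>
    </tu>
  </body>
</tmx>"""

root = ET.fromstring(tmx)
pairs = []
for tu in root.iter("tu"):
    # Map each language variant (tuv) to its segment text.
    segs = {tuv.get(XML_LANG): tuv.findtext("seg") for tuv in tu.iter("tuv")}
    pairs.append(segs)

print(pairs)  # one dict of language -> segment per translation unit
```

Because the container is plain XML like this, any tool that can read the standard can consume a translation memory exported by any other, which is exactly what makes TMX useful as a migration path between TEnTs.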

In that regard it is worth noting that, contrary to common belief, proprietary file formats, irritating as they sometimes are, are not used just to restrict users; they allow developers to extend the functionality of the standard file formats and add features that users need, rely on, and have come to expect.
It is not an ideal situation, and there is still a long way to go in terms of improved (dare I say complete?) interoperability, but we have come a long way since even five years ago.

For example, MemoQ can natively handle Studio SDLXLIFF and SDLPPX (Studio Package) files, as well as WordFast Professional TXML files; OmegaT can be extended through the Okapi Filters Plug-in to support additional file types; SDL Studio’s file support can be extended by installing additional File Type Definitions from the Open Exchange platform; and other TEnTs such as Fluency and Déjà Vu also offer some degree of interoperability, though I don’t have enough experience with them to comment in detail. Since XML has become the de facto format for storing and exchanging information, modern TEnTs can also create customized XML file definitions to parse virtually any XML-based file, even when no native or extendable interoperability exists. And to complement this improved interoperability and extendability, information can always be exchanged via the standardized file formats.
The interoperability is not flawless, and exchanging information is still not always as smooth as it should be, but we have come a long way, indeed.

A couple of days ago I helped a colleague set up a WordFast project in Studio 2014 and thought I would share the experience as a short case study that highlights the process and the basic approach. This process can also be used to add support for MemoQ MQXLIFF files, as well as any other file type available through the SDL Open Exchange platform.

Continue reading →

Tackling Studio 2014 Terminology Editing Slowness

The release of Studio 2014 brought some great new features and performance gains that, to me, finally make Studio a mature Translation Environment Tool. But not everything went smoothly: soon afterwards along came the now-infamous Java Update 45, which broke pretty much the entire terminology editing functionality in Studio and MultiTerm 2014. With the recent release of Cumulative Update 2, this incompatibility between Studio and MultiTerm 2014 and the latest Java version has been resolved, at least for the most part, but another nuisance that has loomed over Studio 2014 since its release remains unchanged: the controversial delay when trying to add or edit terms in a termbase.
This delay is quite a source of controversy among users, with some even claiming it is a productivity showstopper that prevents them from switching to Studio 2014 or forces them to revert to a previous version of Studio. I sympathize with their sentiment and can understand how these short delays add up over time, especially for those who do a lot of terminology editing.
In an attempt to better understand what causes this delay and to find out whether it could be avoided, I decided to investigate the issue further (within the scope of my limited knowledge on the subject), and came up with what I consider a satisfactory workaround for most users.
Continue reading →