Industry 4.0 – Internet 4.0: Translation 4.0?

Changes in research paradigms, the Linguistic (Re)Turn and interweaving approaches in T&I research

After decades in which linguistic approaches to translation were ignored, the last few years have seen more and more voices calling for linguistic questions, models and approaches to be given greater prominence (see e.g. House 2013 or Sinner/Hernández/Hernández 2014 for references).

Similarly, the widely observed split between cognitive or psycholinguistic viewpoints in translatology on the one hand and more sociological methods on the other is gradually being replaced by interdisciplinary and multi-faceted approaches. Among these new tendencies to interconnect different spheres are discussions on the relationship between humans and computers, the resulting change in the role of the translator, and the changing degree of involvement of stakeholders in the translation process. Of particular interest here are the constantly developing technological possibilities (CAT tools, improved dictation and speech-recognition software etc.), which are permanently changing translators’ working lives and forcing them to adapt and develop as never before. Ever greater diversification in opportunities (or constraints, depending on the point of view) means that the profile of translators and interpreters is also becoming ever more varied. This inevitably affects universities’ ability to incorporate new expertise and working methods into their T&I programmes, as the expansion in technology has not been paralleled by an expansion in course duration (which has, in fact, been reduced). The result is that only a few selected technologies or processes can be investigated, and the others must fall by the wayside.

Increasing interest in process-oriented approaches in translation is reflected in questions regarding aspects such as the interface between the cognitive and situational levels, translation as a dynamic system, cooperation between translators and tool and/or software developers, the role of economic factors, revision as part of the workflow process, or the effects of new tools and technologies on productivity.

Audiovisual Translation

Audiovisual translation – including dubbing, subtitling, voiceover and audio-description – is an arena which is constantly and rapidly expanding as new technologies come on-stream. This permits increased tool personalisation and specialisation, giving rise to questions regarding changes in the translation process and the role of the translator, opportunities for integrating praxis and training, and new challenges on the horizon. As in interpreting studies (see below), there is a heightened awareness of non-professional (but nonetheless often high-quality) translation, which is becoming increasingly important in certain fields, especially in new media (e.g. crowd translation or fan translation, and the dynamic intersection between translation and adaptation). The cultural, social and technological changes witnessed in recent years, particularly as part of globalisation and digitalisation, have brought with them new requirements, and these need to be analysed in terms of their consequences for translation. Particular attention will be paid to questions relating to the type and role of audiovisual translation in pluricentric languages (such as multiple dubbing or subtitling of films for different French- or Spanish-speaking regions), accessibility (audio-description), the influence of technological innovation, and dealing with simulated orality. Also relevant in this context are overlaps with studies on intersemiotic translation or adaptation.

Globalisation, localisation, post-editing, machine translation and corpus-based translation studies

Localisation has been growing in volume for many years and, in conjunction with globalisation or internationalisation, has had a deep impact on the translation industry and, subsequently, on T&I research. Only recently have stakeholders begun to examine the technological and socio-economic contexts of localisation and to analyse how, and to what degree, the digital revolution is shaping the profile of translation as a profession and will thus transform T&I training. Among the aspects needing to be examined by quality research are problems such as norm infringements that are noticeable to users but arise when translators lose control of the translation because of constraints imposed during the software programming process (see Behrens 2016). Closely connected to localisation is the issue of machine translation and post-editing, which is of increasing importance in the translation profession. Machine translation and online translation resources are constantly expanding and improving, and in some areas they already compete much more fiercely with, or are more tightly incorporated into, human translation than would have been thought possible only a few years ago. There is an urgent need for research in the field, particularly with regard to changed work processes and the impact of this revolution in translation practice on the creation of translations and their perception by users.

Also closely connected to the issues of machine translation and localisation is corpus-based research into translations as products. As well as source and target texts, corpora can include the various intermediary stages of correction and revision that lead to the final product, which has always been the object of examination by translation researchers. Analysis of these rough versions, translations-in-progress, gist translations etc. is key to understanding the elaboration process, the role of the individual translator and the process steps that are not immediately connected with the translation itself. In consequence, this concept also incorporates process-oriented research and approaches derived from action theory. It also opens up the perspective of analysing the visibility or presence of the translator by mining big data.

Current trends in interpreting research: between cognitive linguistics and the social sciences

Interpreting research, too, has a tradition of separation into cognitive and psycholinguistic schools of thought on the one hand and more sociological approaches on the other, and it is only recently that this schism has begun to close. Cognitive and psycholinguistic research, with its stronger focus on mental processes, has tended to concentrate on classically “professional” interpreting, in particular conference or simultaneous interpreting, while sociologically based interpreting studies have investigated aspects such as involvement in the interaction (Handlungseinbindung), participant constellations (Beteiligtenkonstellationen), situational determination, ethical questions, and inclusion or exclusion. In the process, researchers have tended to focus on what is variously called community interpreting, public service interpreting or mediation, which is not recognised as professional interpreting in many countries.

The development of these two movements and the various tendencies current within them will be given space at LICTRA. The sessions will aim to contribute to providing an overview of major research and to bringing together different strands of investigation. However, they will also attempt to counterbalance the current spate of publications on the “fashionable” research topics of setting and interaction analysis by spotlighting other new and innovative interpreting studies in areas such as brain research, the relatively under-researched question of the impact of new technologies on interpreters, and current tendencies in interpreter training, especially training-the-trainers.

Cognitive processes in the interpreting process have long been a topic for interpreting studies. However, the majority of research in the field has been of a rather intuitive nature, there being only relatively few experimental studies using technologies such as EEG, fMRI or eye-tracking to confirm or rebut theories, or to generate new insights. Such studies as there are open up interesting crosslinks and tie-ins which have not yet been adequately researched, such as the divergence between performance and brain processes when interpreting into the native (A) language, into the foreign (B) language, or between two foreign (B) languages.

Corpus-based interpreting research

Special attention will be given to corpus-based interpreting research. Twenty years after its first use in linguistics, corpus-based research on spoken language remains much less developed than research on written language. This clearly has to do with the greater demands on time and resources involved in collecting and transcribing data, but in interpreting research there is the additional challenge of the legal difficulties surrounding the collection of authentic data (see Bendazzoli/Sandrelli 2009, Pöchhacker 2008). For this reason, the involvement of practising interpreters who are also active in research (so-called practisearchers) is of crucial importance (see Bendazzoli/Sandrelli 2009).

Sign language interpreting

Sign language interpreting is still generally excluded from T&I studies. However, in addition to areas that have a degree of overlap with “classical” interpreting studies, research in sign language interpreting is also currently investigating ethical questions like inclusion and exclusion, and the impact of digitalisation on the domains in which sign language interpreting is used, as technical innovations and digitalisation are constantly changing the environment for the hard of hearing.

Meaning relations in source and target text

Translation studies have always been interested in the role of meanings in source and target texts, and have always examined the retention or loss of meaning during the translation process. Researchers have repeatedly come back to questions of whether meanings and meaning relations can be “translated”, “reflected”, “portrayed” or “newly constructed”, and whether this happens “correctly”, “comprehensively”, “fully” etc.; whether denotative, connotative or pragmatic meanings are (to be) prioritised; and whether semantic loss can be avoided and how it can be or is compensated for. Closely related to this are the evaluation of semantic relations between source and target text, for example with regard to possible loss of quality in the target text, and the point of view of the translation in relation to the original (“copy”, “re-writing”, “creation” etc.), a topic that is currently the subject of much debate in hermeneutic and deconstructivist studies. There will also be discussion of the effect of translation on the reception or perception of the original (see Szlamowicz 2011 and Mossop 1983 for a historical perspective). The breakneck speed of development in the digital arena, continually expanding possibilities in corpus linguistics, new insights into the constitution of meaning and the role of cognition, new approaches in discourse analysis, and finally the awareness that perception must play a greater role in T&I studies (see Sinner/Morales 2015) all mean that “classical” points of view must be revisited (see Cronin 2012), approaches modified, and established methods such as the increasingly frequently used eye tracking (see also 4 below) adapted to take account of changing assumptions.

Phraseology in technical translation; phraseology and variation

Phraseology is breaking out of the realms of lexicology and lexicography and establishing itself as an important subdomain of T&I research. Phraseological research is also being increasingly recognised as relevant to LSP linguistics, technical translation and technical interpreting. Thanks to the possibilities for research and analysis opened up by technical developments, phraseology is also emerging as a promising area in fields that are extremely important from a T&I point of view, such as international law. Increasing international integration, whether on one continent or truly global, is having significant consequences for technical terminology, as world trade, international law and global cultural and ideological tendencies are promoting the standardisation of terms, or at least motivating attempts to simplify communication by aligning and adapting terms. In many areas, much of this is taking place above the level of simple lexemes: instead it involves functional verb constructions or collocations and, especially, phraseologisms, as can be seen in discussions about language variation in legal communication, translation, or terminological aspects in pluricentric languages (see Sinner 2014: 271-272, Tabares/Ivanova 2009).

last modified: 14.10.2016