Philipp Achenbach, Yasmin Göksu, Timo Kullmann, Thomas Tregel, Stefan Göbel. Towards handshape identification for automatic gesture recognition using sign notation systems. In Proceedings of the 8th European Conference on Social Media (ECSM '21), July 2021, ISSN: 2055-7213

Towards handshape identification for automatic gesture recognition using sign notation systems

Philipp Achenbach, Yasmin Göksu, Timo Kullmann, Thomas Tregel, Stefan Göbel
Technical University of Darmstadt, Germany
philipp.achenbach@kom.tu-darmstadt.de
yasemin.goeksu@stud.tu-darmstadt.de
timo.kullmann@stud.tu-darmstadt.de
thomas.tregel@kom.tu-darmstadt.de
stefan.goebel@kom.tu-darmstadt.de

Abstract: Today, about 72 million people worldwide use sign language. Since many deaf people are also unable to speak, they cannot communicate with hearing people through spoken language, even if they can lip-read. But sign language is difficult to learn, and the more than 300 different sign languages in the world make things even more challenging. Therefore, to support the learning of sign language, we want to develop a gamified learning app for sign language that includes automatic sign recognition. The application should provide constructive feedback to the user about the quality of the executed sign. Each sign can be parameterised in terms of its characteristic handshape, orientation, and position: the more parameters are available, the more accurate and detailed the feedback that can be provided to the user. However, the parameters must also be distinguishable from a technical point of view. In linguistics, different notation systems exist to translate signs into written form. For this, the systems decompose signs into their characteristic properties. We want to utilise these notation systems to reduce signs to parameters that are easy to measure, e.g., the hand's shape, orientation, or position. Since the sign notation systems originate from different fields and have different backgrounds, they also differ in their objectives and thus in the number and extent of parameters and their respective features, further called symbols. Hence, there are systems whose notations have just enough detail to identify the intended sign and others with so much detail that the reader can reproduce the sign; this higher level of detail is reflected in a higher number of parameters and symbols. We therefore present eleven sign notation systems and, starting with the handshape as the most characteristic parameter of sign language, compare them with regard to their suitability for our gamified learning app for sign language. A clear differentiation of the handshapes needed for American Sign Language is essential for qualitative feedback to the user. At the same time, a small number of handshapes should reduce the technical effort required for reliable recognition.

Keywords: handshape identification, gesture recognition, sign language, sign notation systems, sign learning app

1. Introduction

1.1 Motivation

According to the World Health Organization (2021), by 2050, more than 700 million people will have disabling hearing loss. Even today, more than 5% of the world's population (430 million people) need rehabilitation services for their hearing loss (World Health Organization, 2021). 72 million of them are deaf and use sign language to communicate.
Some of them are able to lip-read, but they can still only communicate if their communication partner knows sign language. The situation is further complicated by the fact that there are about 300 different sign languages worldwide (United Nations News, 2019). In this context, it should be mentioned that only 2% of deaf people receive training in sign language. This deficit already starts in childhood, as 72% of families with deaf children do not communicate with them in sign language (Waterfield, 2019). Because of these communication barriers, children with hearing loss or deafness in developing countries often do not receive schooling, as the World Health Organization (2021) recently reported. Affected adults are often unemployed or work in lower-level jobs. This is harmful to those affected and results in an annual cost of $980 billion worldwide, of which 57% is attributed to low- and middle-income countries.
This amount includes costs in the areas of health (excluding hearing aids), education, society, and productivity losses (World Health Organization, 2021). Thus, it is evident that deaf people have difficulties communicating with their environment and that more effective solutions for learning sign language are needed. For this, it is necessary to track and recognise signs. Signs could be recognised as a whole, which would certainly work well for a feasibility study. However, for meaningful use in a learning app, the vocabulary should be similar to that of a dictionary, so about 5,000 different signs would have to be trained. Because of this large number, it is necessary to break the signs down into their characteristic components. This is already done by notation systems for sign language, which transfer signs into a written form. An analysis of these notation systems can therefore help examine the structure of signs.

1.2 Goal

To enable better communication between the deaf and the hearing, we aim to develop a gamified learning app for sign language that includes automatic sign recognition. The learning app provides qualitative feedback about the executed sign: the user performs a gesture that is recognised by the application, and the application reports whether the gesture was performed correctly, considering and differentiating aspects such as handshape or movement. This way, the user learns which elements of the sign were executed correctly and which require improvement. This qualitative feedback is intended to facilitate the learning of sign language.

For this, we investigate the parameters into which a gesture can be divided and the parameters' possible values (from now on called symbols) in order to identify gestures of American Sign Language (ASL). They should be chosen such that all (one-handed) signs of ASL can be recognised. We assume that recognising two-handed signs is feasible by applying the recognition procedure to each hand separately. For this parameterisation, we utilise sign notation systems, as they already use parameterisation to convert signs into a written form. The readability of a sign notation system increases with the number of parameters, since more parameters mean that a written word conveys more details of a sign. Likewise, a large number of symbols means that a written sign can be mapped back to its corresponding sign with little ambiguity. On the technical side, however, we want our recognition system to achieve high precision, which favours a notation system with fewer symbols, since classification accuracy increases as the number of classes to distinguish decreases. The challenge is to find a notation system with enough parameters and symbols to accurately identify signs through their unique combination of parameters/symbols and to give qualitative feedback to the user, but without unnecessary detail that could make technical recognition and differentiation difficult. Therefore, in this paper, we introduce different sign notation systems and start by examining the handshape as the most characteristic parameter of sign language.
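To make the intended per-parameter feedback concrete, the following minimal Python sketch models a sign as a combination of parameter symbols and compares a learner's attempt against a target sign parameter by parameter. All parameter names and symbol values are illustrative placeholders rather than the inventory of any particular notation system.

```python
from dataclasses import dataclass, fields

@dataclass(frozen=True)
class SignDescription:
    """A one-handed sign, reduced to measurable parameters.

    The parameters and example values are placeholders; the actual
    inventory would come from the chosen notation system.
    """
    handshape: str    # e.g. "flat_hand"
    orientation: str  # e.g. "palm_out"
    location: str     # e.g. "chin"
    movement: str     # e.g. "downward"

def feedback(target: SignDescription, attempt: SignDescription) -> dict:
    """Compare an executed sign against the target, one verdict per parameter.

    Returning a verdict per parameter is what enables qualitative feedback
    ("handshape correct, orientation wrong") instead of a single
    accept/reject decision for the whole sign.
    """
    return {
        f.name: getattr(target, f.name) == getattr(attempt, f.name)
        for f in fields(SignDescription)
    }

# Example: the learner got everything right except the palm orientation.
target = SignDescription("flat_hand", "palm_out", "chin", "downward")
attempt = SignDescription("flat_hand", "palm_in", "chin", "downward")
print(feedback(target, attempt))
# {'handshape': True, 'orientation': False, 'location': True, 'movement': True}
```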
2. Background

American Sign Language (ASL) and other sign languages play a major role in the lives of the deaf. According to SIL International (2021), there are about 459,850 native speakers of ASL, which, according to various dictionaries, consists of up to 5,000 signs (Stokoe, Casterline and Croneberg, 1976; Sternberg and Sternberg, 1998; Tennant, Gluszak and Brown, 1998; Costello, 1999; Valli, 2006). It can thus take multiple years to become a fluent signer. To increase inclusion, we need to make sign language as accessible as possible. Hence, research on sign languages and their structure is needed.

Until 1960, sign language was not recognised as a complete language because of missing linguistic research; it was said to be less precise, flexible, and subtle than spoken language. William C. Stokoe was the first linguist to prove that ASL uses a so-called phonological or sublexical structure. This means that signs in ASL take on different meanings due to differentiable structural elements, as can be seen in Figure 1 (Armstrong, Karchmer and Van Cleve, 2002).
Figure 1: Meaning and effect of parameter changes using simple hand gestures as examples: (1) victory gesture, (2) insult gesture (change of palm orientation), (3) swear gesture (change of handshape), (4) fingers as an embodiment of rabbit ears (change of position in gesture space) (Fricke and Bressem, 2020).

For a better understanding, we define a parameter of a sign notation system to be a meaning-altering characteristic of a sign, e.g., the handshape. A parameter has configurations (called symbols) that describe the state of the parameter. Every sign can be characterised by a combination of these symbols. Since every change in a symbol of a parameter alters the meaning of a sign, one state of a parameter describes a phoneme. A phoneme is "one of the smallest units of speech that make one word different from another word" (Cambridge University Press, no date b). Examples of different phonemes are the vowels i in pin and a in pan. Conversely, we define a symbol in a sign notation system to be a grapheme of that system. A grapheme is "the smallest unit in a system of writing a language that can express a difference in sound or meaning" (Cambridge University Press, no date a). A phoneme can correspond to multiple graphemes: for example, the sounds of ee in see and ey in key correspond to the same phoneme but different graphemes (Szczegielniak, 2013).

3. Notation Systems

A sign notation system is a writeable set of characters that provides a readable representation of signs. Sign notation systems use parameters to categorise the elements of a sign and make them differentiable. An example of a parameter in sign language is the sign's handshape. Symbols in a sign notation system represent a configuration of a parameter or a subset thereof. Figure 2 shows several examples of different parameters and symbols of various sign notation systems.

Figure 2: Example words with their respective representations in the sign notation systems examined in this paper (ASL Font, 2013; 'Brief Comparison of ASL Writing Systems', 2021; Sample Words - ASLSJ, 2021; Sign Language IPA, 2021; Grushkin, 2017).

Notation systems differ in which parameters are considered, based on the priorities and goals of the system and the state of the science at the time. Different notation systems may also use different symbol sets, and they use different names for parameters and symbols, such as aspects (Stokoe, Casterline and Croneberg, 1976) or graphemes (Supalla, McKee and Cripps, 2014). Table 1 gives an overview of the notation systems for ASL examined in this paper, starting with Stokoe (1960). There are several more sign notation systems, such as Sign Language Phonetic Annotation (1989) and Prosodic Model Handshape Coding (2008), that are not examined further here (Hochgesang, 2014).
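Under this terminology, a notation system can be modelled as a set of parameters, each with its own symbol (grapheme) inventory, and a written sign as one symbol choice per parameter. The sketch below illustrates this view; the parameter names and symbol counts follow Stokoe's original 1960 inventory as described in section 3.1, while the symbol names themselves are invented placeholders.

```python
# A notation system, reduced to the structure used in this paper: a set of
# parameters, each with its inventory of symbols (graphemes). The counts
# follow Stokoe's 1960 system (section 3.1); the names are placeholders.
stokoe_like = {
    "tab": {f"loc_{i}" for i in range(12)},  # 12 location symbols
    "dez": {f"hs_{i}" for i in range(19)},   # 19 handshape symbols
    "sig": {f"mov_{i}" for i in range(24)},  # 24 movement symbols
}

def is_writable(sign: dict, system: dict) -> bool:
    """A sign is writable in a system iff every parameter it uses exists
    in the system and the chosen symbol is in that parameter's inventory."""
    return all(
        param in system and symbol in system[param]
        for param, symbol in sign.items()
    )

print(is_writable({"tab": "loc_3", "dez": "hs_7", "sig": "mov_0"}, stokoe_like))   # True
print(is_writable({"tab": "loc_3", "dez": "hs_99", "sig": "mov_0"}, stokoe_like))  # False
```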
Table 1: Overview of sign notation systems examined in this paper.

| System | Year | Parameters | Symbols | Handshapes¹ | Language | Pictorial |
|---|---|---|---|---|---|---|
| Stokoe Notation | 1960/1965 | 3/4 | 55/64 | 39 | ASL | No |
| SignWriting | 1974 | 5 | 672 | 255/832² | International | Yes |
| HamNoSys | 1984 | 5 | 210 | 200+/n.a.² | International | Yes |
| SignFont | 1987 | 5 | 272 | 125+ | ASL | Yes |
| ASL-phabet | the 1990s | 3 | 32 | 22 | ASL | Yes |
| ASLO | 1997 | 6 | n.a. | 46+ | ASL | No |
| si5s/ASLwrite | 2003/2011 | 5 | 105 | 65+ | ASL | Yes |
| SLIPA | 2005 | 5 | n.a. | 54+/n.a.² | International | No |
| ASLSJ | 2009 | 9 | n.a. | 59+ | ASL | No |
| SignScript | 2010 | 5 | 131 | 46 | ASL | Yes |
| ASLFont | 2013 | 7 | n.a. | 51 | ASL | Yes |

¹ The actual number of handshapes may be increased by combinations with other parameters or features. ² The second number corresponds to the number of handshapes for ASL (for international notation systems).

3.1 Stokoe (1960/1965)

William C. Stokoe designed the Stokoe Notation System in its first iteration in 1960, with the goal of enabling linguistic research of ASL. This first iteration began with 55 symbols grouped into three parameters, first mentioned in the book "A Dictionary of American Sign Language on Linguistic Principles" by Stokoe et al. (Stokoe, 1960; Stokoe, Casterline and Croneberg, 1976; Martin, 2000). Tab (tabula) is the first parameter and describes the location where the sign is performed; there are 12 symbols for Tab. Dez (designator) is the second parameter and describes the handshape at the beginning of the sign; there are 19 symbols for Dez. Sig (signation) is the third parameter and describes the movement of the hand during the sign; there are 24 symbols for Sig. In 1965, Stokoe extended his notation system by a fourth parameter describing the hand's orientation, with nine additional symbols. Multiple symbols, in the order mentioned above, can be used to describe a single sign. Non-manual features of a sign, i.e., parts of body language such as mouth or eyebrow position or direction of view, are not represented.

3.2 SignWriting (1974)

SignWriting is a notation system that has its origin in DanceWriting, developed by Valerie Sutton. The University of Copenhagen used DanceWriting, a system of pictorial symbols for documenting dance moves, as a base for a notation system for research on human movements. After further development by Sutton, SignWriting was created in 1974, specialising in the representation of sign language (Sutton and Frost, 2008). The system can be used for writing down signs from an expressive or a receptive viewpoint, with the expressive viewpoint being the standard (Sutton, 2014). The focus is on an easy-to-read, highly pictorial way of transcribing signs. SignWriting consists of five parameters with 672 symbols: movements, handshapes, locations, orientations, and non-manual properties (Martin, 2000). Movements can be divided into starting locations, movement actions, and end locations. Symbols can be combined very freely to allow for all possible signs in all known sign languages. Also unique to SignWriting is its parallel representation: while in other notation systems symbols are written down in sequence, signs in SignWriting are written as a literal picture of the performed sign, with symbols placed at their real relative locations in space. Overall, SignWriting allows for nearly unlimited combinations, which introduces considerable complexity but ensures an easy-to-read notation system for human readers.
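The structural contrast between SignWriting and sequential systems such as the Stokoe Notation can be made explicit in code: a sequential sign is an ordered list of symbols, whereas a SignWriting sign is a set of symbols carrying positions within the two-dimensional sign picture. The following sketch is purely illustrative; all symbol names and coordinates are invented.

```python
from dataclasses import dataclass

# Sequential notation (e.g. Stokoe): symbols are written one after another,
# here in Stokoe's order Tab, Dez, Orientation, Sig.
sequential_sign = ["forehead", "flat_hand", "palm_out", "move_down"]

@dataclass(frozen=True)
class PlacedSymbol:
    """A symbol together with its position in the sign 'picture'."""
    symbol: str
    x: float  # horizontal position in signing space
    y: float  # vertical position in signing space

# Spatial notation (SignWriting): symbols mirror their real relative
# locations, so the written sign is a small picture rather than a string.
spatial_sign = [
    PlacedSymbol("flat_hand", x=0.5, y=0.2),   # hand drawn near the top
    PlacedSymbol("arrow_down", x=0.5, y=0.5),  # movement arrow below it
]

print(" ".join(sequential_sign))
print([(s.symbol, s.x, s.y) for s in spatial_sign])
```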
3.3 HamNoSys (1984)

The Hamburg Sign Language Notation System (HamNoSys) was developed at the University of Hamburg in 1984. HamNoSys has its roots in the Stokoe Notation but is applicable to all known sign languages, not only ASL, making it an international system. In addition, HamNoSys aims to provide a compromise between complexity and ease of use: complexity is increased by a formal syntax but kept down by options to reduce notation length; at the same time, the formal syntax, combined with iconicity, improves ease of use and readability. The system also integrates well with computer systems and allows for adaptation, extension, and further development for specific needs. HamNoSys describes signs in three parts, using five parameters and about 210 symbols. The first part is the starting location, which covers four of the five parameters: the location, shape, and orientation of the hand performing the sign, as well as non-manual features. The second part is the performed actions, the fifth parameter, which describes how the starting location is changed. Actions can include internal movements of the hand or path movements; multiple actions can be performed in sequence or in parallel, and repetitions can be defined as well. The third part is symmetry, which describes how the second hand copies the actions of the dominant hand; otherwise, actions can be defined for both hands separately. HamNoSys restricts the description of non-manual features of a sign to a limited number of symbols to keep complexity down, but each body part can be assigned actions just like the hand if more detail is needed. HamNoSys is available in Unicode (Hanke, 2004).

3.4 SignFont (1987)

SignFont is a notation system for ASL created in 1987 by Don Newkirk. It has 272 symbols across five parameters: handshape, contact region, and location, definable for both hands, and movements; in addition, there is a parameter for non-manual movements. SignFont was used as a basis for developing the ASL-phabet. The SignFont homepage has unfortunately been offline since 2002 and is only accessible via the web archive (ASL Font, 2013; ScriptSource - SignFont Notation, 2021).

3.5 ASL-phabet (the 1990s)

Samuel Supalla created ASL-phabet in the 1990s as a means of teaching Deaf children the principles of using an alphabet, with the aim of improving their understanding of spoken languages and their alphabets (Supalla, McKee and Cripps, 2014). The most significant difference between ASL-phabet and other notation systems is its associative approach: symbols do not have a pictorial meaning but are associated with a specific meaning, which requires the context of the other symbols in a sign. It therefore works like the alphabets of spoken languages, as it does not explain how to perform a sign but rather creates an association with the correct performance through the group of symbols representing a sign. ASL-phabet consists of three parameters with 32 symbols, which represent graphemes. The handshape is the first parameter, with 22 symbols. The second is the location, with five symbols: the forehead, mouth/chin, upper chest, stationary arm, and the area in front of the body. The last parameter is movement, with five symbols: three for the three axes in space, one for circular movements, and one for internal movements.
Signs can be represented with up to six symbols, but each parameter must be included with at least one symbol. The parameters appear in the order listed above: there can be up to two handshape symbols, one location symbol, and up to three movement symbols (Supalla, McKee and Cripps, 2014). Because of the small number of symbols and the associative approach, there is a lot of ambiguity in ASL-phabet (Eiffert, 2012). Unfortunately, the homepage of ASL-phabet is offline, and the domain is for sale (Aslphabet.com is for sale, 2021).

3.6 ASL Orthography (1997)

ASL Orthography (ASLO) is an unfinished idea for a notation system created by ASL student and computer programmer Travis Low in 1997. The system was supposed to use ASCII characters and have six parameters: location type, orientation, handshape, quality, location, and motion. The parameters are used in this order for both the dominant and the non-dominant hand (ASL Font, 2013). Unfortunately, the homepage of ASL Orthography is now an empty WordPress blog ('Dawnstar Blog – Just another WordPress site', 2018).
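Looking back at ASL-phabet (section 3.5), its rigid composition rule (one to two handshape symbols, exactly one location symbol, one to three movement symbols, in that order) can be checked mechanically. The sketch below encodes the rule as a regular expression over symbol classes; the single-letter tags H, L, and M are our own abbreviations, not ASL-phabet graphemes.

```python
import re

# ASL-phabet structure (section 3.5): 1-2 handshape symbols (H), exactly
# one location symbol (L), and 1-3 movement symbols (M), at most six
# symbols in total. A real implementation would substitute the system's
# actual 22 + 5 + 5 graphemes for the class tags used here.
ASLPHABET_STRUCTURE = re.compile(r"H{1,2}LM{1,3}")

def is_valid_structure(symbol_classes: str) -> bool:
    """Check whether a sequence of symbol classes forms a well-formed
    ASL-phabet sign representation."""
    return ASLPHABET_STRUCTURE.fullmatch(symbol_classes) is not None

print(is_valid_structure("HLM"))     # True  (minimal: three symbols)
print(is_valid_structure("HHLMMM"))  # True  (maximal: six symbols)
print(is_valid_structure("HLL"))     # False (two location symbols)
```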
3.7 si5s (2003) / ASLwrite (2011)

Robert Arnold developed si5s in 2003 and first presented it to the public in 2010 at the DeafNation World Expo in Las Vegas. The system aims to provide a way of having written representations of ASL. A year later, in 2011, ASLwrite split off as an open-source alternative because of differences of opinion about how to continue the project. Because of their shared origin, both notation systems have the same structure: they consist of five parameters and 105 symbols. Unfortunately, si5s is no longer available, and the ASLwrite homepage is currently in maintenance mode (Si5s.org, 2021; aslwrite.com, no date). There are 67 symbols for handshapes, collectively called the digibet. The second parameter, diacritics, describes internal movements of the hand. External movements are summarised in the third parameter. The fourth parameter is the location where the sign is performed. The last parameter describes non-manual features of a sign with 16 different symbols. The order of the parameters is not strictly defined, and not every parameter is needed in each representation of a sign (Clark, 2012). Since ASLwrite is an open-source project, it has the potential to be modified for different needs.

3.8 Sign Language IPA (2005)

Sign Language IPA (SLIPA) was developed in 2005 by the linguist David J. Peterson. SLIPA has five parameters: handshape, location, movements, non-manual features, and symbols for representing two-handed signs. A special feature of SLIPA is the possibility to create indices for easy referencing of previously written signs. Also, SLIPA uses ASCII and Unicode and is therefore well integrated into computer systems (ASL Font, 2013; Peterson, no date).

3.9 ASL Sign Jotting (2009)

Thomas Stone developed ASL Sign Jotting (ASLSJ) in 2009. This notation system focuses on making it possible to write down signs quickly; as a consequence, the accuracy of representations has lower priority (Hutchinson, 2012). Like SLIPA, ASLSJ uses ASCII symbols, with the same benefit. There are nine parameters: handshape, handshape of the non-dominant hand, location, palm orientation, distance or contact, movements, ending handshape, non-manual features, and a parameter to indicate repetition. ASLSJ also allows for simple spelling of words by using the manual alphabet for ASL (Stone, no date).

3.10 SignScript (2010)

SignScript was developed in 2010 by Donald Grushkin as a notation system for ASL for public use. It consists of five parameters with 131 symbols: handshape (46), palm orientation (5), location (12), movements of the hand (39), and non-manual features (29) (ASL Font, 2013). They appear in the representation of a sign in the order listed before (Goyal, 2015).

3.11 Symbol Font for ASL (2013)

Symbol Font for ASL (ASLFont) is a notation system for writing ASL online. The goal is to provide an easy-to-use system with great integration into computer systems. There are seven parameters: handshapes (51), orientation (24), location, relative location, contact, movement, and non-manual features. Handshapes are mapped to keys on the keyboard, with digits representing the number handshapes and lowercase letters the corresponding handshapes of the finger alphabet. Capital letters are mapped to additional handshapes. Directions, orientation, and some other symbols, which create new meanings for other symbols when put in context, are mapped to special characters on the keyboard (ASL Font, 2013).
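ASLFont's handshape input amounts to a lookup table from keyboard keys to handshapes. The sketch below illustrates the scheme described above with generic placeholder names; the actual glyph assignments are defined by the font itself (ASL Font, 2013) and are not reproduced here.

```python
import string

# Illustrative sketch of ASLFont's key-to-handshape scheme: digits map to
# the number handshapes, lowercase letters to the corresponding
# finger-alphabet handshapes; capitals and special characters carry
# additional handshapes and modifiers (shown here only as a fallback).
key_to_handshape = {
    **{d: f"number_{d}" for d in string.digits},
    **{c: f"fingerspelled_{c}" for c in string.ascii_lowercase},
}

def handshape_for_key(key: str) -> str:
    return key_to_handshape.get(key, "additional_handshape_or_modifier")

print(handshape_for_key("5"))  # number_5
print(handshape_for_key("b"))  # fingerspelled_b
print(handshape_for_key("B"))  # additional_handshape_or_modifier
```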
4. Comparison & Discussion

Our goal is to recognise ASL signs on a technical basis. For this purpose, we want to decompose the signs into their characteristic features (parameters). To further specify this, we first precisely define the selected vocabulary, considering two sources: First, we investigate the book Concise American Sign Language Dictionary (Costello, 1999), which describes 4,500 different ASL signs. Second, we examine the American Manual Alphabet, shown on the left side of Figure 3, which is one-handed in ASL. It is necessary for representing words that are not available as signs, e.g., names or places (Costello, 1999), and is therefore indispensable for everyday communication.
Figure 3: American Manual Alphabet, also known as the finger alphabet, used for fingerspelling (left), and the additional handshapes used in the Concise American Sign Language Dictionary (right) (Costello, 1999).

The handshape is certainly a sign's most characteristic parameter. Looking at the finger alphabet, for example, we see that most of the 26 signs differ only in the shape of the hand. Few signs vary by orientation (h and u, k and p, g and q), and only two signs involve movement and are thereby differentiated from their equivalent handshapes (j from i and z from d). The dictionary we studied uses nine additional handshapes (Figure 3, right side). These, plus the remaining 21 handshapes from the finger alphabet, lead to 30 different handshapes that need to be recognised and differentiated.

Table 2 shows these handshapes for all the notation systems we have presented. We have outlined all symbols that are used more than once by a notation system; our approach would not be able to distinguish between the corresponding handshapes, and thus the learning app could not give qualitative feedback for them. As the table demonstrates, there are duplicate symbols in Stokoe, ASL-phabet, ASLSJ, and SignScript, which is why they are not recommended for our approach. In comparison, the other notation systems show sufficient detail to differentiate between the required handshapes. As shown in Table 1, SignWriting, HamNoSys, and SignFont have a high number of different handshapes, leading to a higher technical effort in the later data acquisition and classification. The remaining systems, ASL Orthography, si5s/ASLwrite, SLIPA, and ASLFont, have enough different handshapes (46 to 65) to recognise all 30 handshapes we require. It should be emphasised that SLIPA is the only one of these notation systems that can be used internationally. Furthermore, it is (currently) exceptionally challenging to obtain further information regarding si5s/ASLwrite and ASL Orthography. As a result, we deem SLIPA and ASLFont particularly well-suited for our approach. In future work, however, we would like to investigate further parameters such as location, orientation, and movement and also consider these in our evaluation.
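The duplicate check underlying Table 2 can be stated compactly: given a notation system's mapping from each required handshape to its written symbol, any handshapes that share a symbol are indistinguishable for our approach. The sketch below implements this check; the example mapping is invented and does not reproduce any of the examined systems.

```python
from collections import defaultdict

def indistinguishable_groups(handshape_to_symbol: dict) -> list:
    """Return groups of handshapes that a notation system writes with the
    same symbol. Any group with more than one member cannot be told apart
    by the learning app -- the criterion applied in Table 2."""
    by_symbol = defaultdict(set)
    for handshape, symbol in handshape_to_symbol.items():
        by_symbol[symbol].add(handshape)
    return [group for group in by_symbol.values() if len(group) > 1]

# Invented example: this system writes 'hook' and 'claw' with one symbol.
example_system = {"fist": "S1", "flat": "S2", "hook": "S3", "claw": "S3"}
print(indistinguishable_groups(example_system))
# [{'hook', 'claw'}] -- these two handshapes could not receive distinct feedback
```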
5. Conclusion

In this paper, we have presented and compared in detail eleven sign notation systems. Each of these systems is able to parameterise ASL signs but uses a different number of parameters and parameter values (symbols). We examined the handshape as probably the most characteristic parameter of sign language and compared the handshape symbols of the notation systems, with the goal of clearly distinguishing a given set of 30 handshapes used in ASL. Four of the notation systems could not clearly distinguish the handshapes because they use the same symbol for several handshapes. Three other notation systems could clearly distinguish the handshapes but have a very high number of symbols, which we wanted to avoid for technical reasons. Of the remaining four notation systems, two are unfortunately no longer available. The two notation systems Sign Language IPA (SLIPA) and Symbol Font for ASL (ASLFont) were the most suitable for our project. In the future, we want to investigate more parameters to find out which sign notation system is best suited to parameterise ASL signs and serve as the basis for an ASL learning app.

Table 2: Comparison of symbols of different notation systems for all handshapes we require for our approach. Bordered fields indicate multiple uses of the same symbol (Stokoe, Casterline and Croneberg, 1976; ASL Font, 2013). ¹ Stokoe special feature for extended thumb (or another non-prominent finger). ² Stokoe special feature for bent fingers.
References

Armstrong, D. F., Karchmer, M. A. and Van Cleve, J. V. (eds) (2002) The Study of Signed Languages: Essays in Honor of William C. Stokoe. Available at: https://www.bibliovault.org/BV.book.epl?ISBN=9781563681233 (Accessed: 9 May 2021).
ASL Font (2013) ASL Font: Symbol Font for ASL. Available at: http://aslfont.github.io/ (Accessed: 4 May 2021).
Aslphabet.com is for sale (2021) HugeDomains. Available at: https://www.HugeDomains.com/domain_profile.cfm?d=Aslphabet.com (Accessed: 10 May 2021).
aslwrite.com (no date). Available at: http://www.aslwrite.com/ (Accessed: 10 May 2021).
'Brief Comparison of ASL Writing Systems' (2021) Wikipedia. Available at: https://en.wikipedia.org/wiki/File:Brief_Comparison_of_ASL_Writing_Systems.jpg (Accessed: 9 May 2021).
Cambridge University Press (no date a) Definition of Grapheme. Available at: https://dictionary.cambridge.org/de/worterbuch/englisch/grapheme (Accessed: 8 May 2021).
Cambridge University Press (no date b) Definition of Phoneme. Available at: https://dictionary.cambridge.org/de/worterbuch/englisch/phoneme (Accessed: 8 May 2021).
Clark, A. (2012) How to Write American Sign Language. ASLwrite.
Costello, E. (1999) Random House Webster's Concise American Sign Language Dictionary. Random House.
'Dawnstar Blog – Just another WordPress site' (2018). Available at: http://dawnstar.org/ (Accessed: 10 May 2021).
Eiffert, S. E. (2012) Samuel Supalla and the ASL-phabet. Master's Thesis. The University of Arizona.
Everson, M. et al. (2012) 'Proposal for encoding Sutton SignWriting in the UCS'.
Fricke, E. and Bressem, J. (2020) Gesten - gestern, heute, übermorgen. Vom Forschungsprojekt zur Ausstellung. Chemnitz: Universitätsverlag Chemnitz. Available at: https://nbn-resolving.org/urn:nbn:de:bsz:ch1-qucosa2-339590.
Goyal, L. (2015) 'Review and Comparison of Writing Notations of Sign Language', International Journal of Engineering Sciences (IJoES).
Grushkin, D. A. (2017) 'Writing Signed Languages: What For? What Form?', American Annals of the Deaf, 161(5), pp. 509–527. doi: 10.1353/aad.2017.0001.
Hanke, T. (2004) 'HamNoSys - representing sign language data in language resources and language processing contexts', in LREC, pp. 1–6.
Hochgesang, J. A. (2014) 'Using design principles to consider representation of the hand in some notation systems', Sign Language Studies, 14(4), pp. 488–542.
Hutchinson, J. J. (2012) 'Analysis of Notation Systems for Machine Translation of Sign Languages'. Unpublished Honours Dissertation. Grahamstown: Rhodes University.
Liang, R.-H. and Ouhyoung, M. (1998) 'A real-time continuous gesture recognition system for sign language', in Proceedings of the Third IEEE International Conference on Automatic Face and Gesture Recognition. IEEE, pp. 558–567.
Martin, J. (2000) A Linguistic Comparison - SignWriting for Sign Languages. Available at: http://www.signwriting.org/archive/docs1/sw0032-Stokoe-Sutton.pdf.
Peterson, D. J. (no date) SLIPA: An IPA for Signed Languages. Available at: http://dedalvs.com/slipa.html#indexing.
Sample Words - ASLSJ (2021). Available at: http://www.aslsj.com/9-sample-words (Accessed: 9 May 2021).
ScriptSource - SignFont Notation (2021). Available at: https://scriptsource.org/cms/scripts/page.php?item_id=script_detail&key=Qaao (Accessed: 8 May 2021).
Si5s.org (2021). Available at: http://ww1.si5s.org/ (Accessed: 10 May 2021).
Sign Language IPA (2021). Available at: https://dedalvs.com/slipa.html (Accessed: 9 May 2021).
SIL International (2021) Ethnologue: Languages of the World - American Sign Language. SIL International. Available at: https://www.ethnologue.com/language/ase/23 (Accessed: 4 February 2021).
Sternberg, M. L. and Sternberg, M. (1998) American Sign Language Dictionary-Flexi. HarperResource.
Stokoe, W. (1960) Sign Language Structure. Reprinted 1978. Silver Spring, MD: Linstok Press.
Stokoe, W. C., Casterline, D. C. and Croneberg, C. G. (1976) A Dictionary of American Sign Language on Linguistic Principles. Linstok Press.
Stone, T. (no date) ASLSJ. Available at: http://www.aslsj.com/ (Accessed: 9 May 2021).
Supalla, S., McKee, C. and Cripps, J. (2014) 'An Overview on the ASL-phabet', Gloss Institute's Monograph Series, 1, pp. 1–18.
Sutton, V. (2014) Lessons in SignWriting: Textbook. SignWriting.
Sutton, V. and Frost, A. (2008) 'SignWriting: sign languages are written languages', Center for Sutton Movement Writing, CSMW, Tech. Rep.
Szczegielniak, A. (2013) 'Phonetics: The Sounds of Language'. Available at: https://scholar.harvard.edu/files/adam/files/phonetics.ppt.pdf (Accessed: 8 May 2021).
Tennant, R. A., Gluszak, M. and Brown, M. G. (1998) The American Sign Language Handshape Dictionary. Gallaudet University Press.
United Nations News (2019) Sign language protects 'linguistic identity and cultural diversity' of all users, says UN chief. United Nations. Available at: https://news.un.org/en/story/2019/09/1047012 (Accessed: 26 April 2021).
Valli, C. (2006) The Gallaudet Dictionary of American Sign Language. Gallaudet University Press.
Waterfield, S. (2019) ASL Day 2019: Everything You Need To Know About American Sign Language. Newsweek. Available at: https://www.newsweek.com/asl-day-2019-american-sign-language-1394695 (Accessed: 30 April 2021).
World Health Organization (2021) Deafness and hearing loss. World Health Organization. Available at: https://www.who.int/news-room/fact-sheets/detail/deafness-and-hearing-loss (Accessed: 26 April 2021).