Patent Translate (powered by EPO and Google) — Notice: This translation is machine-generated. It cannot be guaranteed to be intelligible, accurate, complete, reliable or fit for specific purposes. Critical decisions, such as commercially relevant or financial decisions, should not be based on machine-translation output.

DESCRIPTION JP2010134674

The present invention provides a voice guidance system that announces, by voice, the direction of a destination relative to the direction the user is facing at that moment. A guide plate 4 is installed at a street intersection. The guide plate 4 transmits, in a specific direction and in the ultrasonic band, guide information that includes the categories and azimuths of a plurality of guide objects. Passersby 10, 11 and 12 each carry a guide terminal, which is a receiving device. The guide terminal accepts the selection of a desired category from the user (passerby). It also detects the direction in which the device (and hence the passerby) is facing, based on the arrival direction of the ultrasonic guide information. The guide terminal then selects the category desired by the user from the received guide information, synthesizes a phrase expressing the difference (the relative direction) between the azimuth of the guide object and the direction the user is facing, and reproduces it. [Selected figure] Figure 1

Voice guidance device and voice guidance system

[0001] The present invention relates to a voice guidance device and a voice guidance system that reproduce a guide voice indicating, mainly, the direction of a pedestrian's destination.

[0002] On streets, in shopping districts and the like, guide plates are installed to direct people to nearby facilities. Such an information board generally carries a map showing the current location and the surrounding facilities. With a map, however, the relationship between a direction shown on the map and the actual direction is often difficult for ordinary passersby to grasp. A further problem is that a map is of no use to the visually impaired.

[0003] A voice guidance device such as that disclosed in Patent Document 1 has therefore been proposed. In that device, when a selection switch corresponding to a facility shown on the map of a display board (touch panel) is turned on, route guidance to the facility is given by voice.

[0004] Registered Utility Model Publication No. 3028958

[0005] Even with the device of Patent Document 1, however, the relationship between a direction shown on the map and the actual direction is as hard to understand as with a conventional map, and a route to the facility selected by the user is announced even when that facility does not lie in the direction the user actually wants to go. Moreover, expressions such as "on your right" and "on your left" in the guidance voice refer to the right and left of a person facing the guidance device, yet the user does not necessarily operate the device while facing it, so the direction indications of the voice guidance can be inaccurate. Furthermore, while one person is using the device of Patent Document 1 it cannot be used by anyone else, so it cannot serve many people at the same time.
[0006] The present invention therefore aims to provide a voice guidance device and a voice guidance system that solve the above problems by announcing, by voice, the direction of the destination relative to the direction the user is facing at that moment.

[0007] The invention of claim 1 comprises: receiving means for receiving guide information that includes azimuth information and category information of guide objects; direction detection means for detecting the direction in which the device itself is facing; category selection means for accepting the user's selection of a category; information extraction means for extracting, from the guide information received by the receiving means, the guide information of the category selected by the category selection means; and reproduction means for synthesizing and reproducing a phrase expressing the difference between the azimuth information of the guide information extracted by the information extraction means and the direction of the device detected by the direction detection means.

[0008] In the invention of claim 2, the receiving means is means for receiving a broadcast transmitted by ultrasonic waves, and the direction detection means detects the direction in which the device is facing based on the arrival direction of the ultrasonic waves.

[0009] The invention of claim 3 comprises: the voice guidance device according to claim 2; guide information storage means for storing guide information that includes azimuth information and category information of guide objects; and transmitting means for superimposing the guide information on ultrasonic waves and broadcasting it.

[0010] According to the present invention, a phrase (for example, "diagonally to the right") expressing the difference between the direction of the voice guidance device, that is, the direction of the user carrying it, and the azimuth of the target guide object is reproduced. Whichever way the user is facing, directions can therefore be given relative to that orientation, and the user can be guided accurately toward the target object within a frame of reference centered on himself or herself.

[0011] A voice guidance system that is an embodiment of the present invention will now be described with reference to the drawings.

[0012] FIG. 1 is a plan view of a street on which the guide plate 4 of the voice guidance system is installed. As indicated by the compass symbol 5, north is at the top of the drawing. The street is a T-junction formed by a road 1 running east-west and a road 2 meeting road 1 from the south. The guide plate 4 is installed on the wall on the north side of the intersection 3 of this T-junction. Speakers 37 (see FIG. 3) are installed on the guide plate 4, and modulated sound for voice guidance is emitted from these speakers 37 toward the south.

[0013] In this figure, three passersby 10, 11 and 12 are passing through the intersection 3. Each of them wears a guide terminal 20 (see FIG. 2) that reproduces the guide voice. The guide terminal 20 is a headset-type terminal device, as shown in FIG. 2.

[0014] The guide terminal 20 worn by each passerby 10, 11, 12 receives the modulated sound emitted by the speakers 37, detects its own orientation from that sound, and plays a guide voice indicating the direction of the facility (target object) that the user (passerby) is looking for.
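As a rough illustration of the processing chain described in claim 1 and paragraph [0014] (receive guide information, detect the device heading, accept a category selection, extract matching information, and synthesize a direction phrase), the following Python sketch strings these steps together. It is a minimal sketch only: the data class, function names and sample values are invented for illustration and are not taken from the patent.

```python
# Minimal sketch of the claim-1 processing chain (receive -> detect heading ->
# select category -> extract -> synthesize phrase). All names and values are
# invented for illustration; the patent does not specify an implementation.
from dataclasses import dataclass

@dataclass
class GuideInfo:
    object_id: int
    azimuth_deg: float   # absolute azimuth of the object, clockwise from north
    category: int
    distance_m: float
    priority: int
    name: str

def extract_by_category(records, category):
    """Information extraction means: keep only the user's selected category."""
    return [r for r in records if r.category == category]

def relative_phrase(object_azimuth, device_heading):
    """Reproduction means (simplified): turn the azimuth difference into words."""
    phi = (object_azimuth - device_heading + 180) % 360 - 180  # -180..180
    if abs(phi) <= 10:
        return "straight ahead"
    side = "right" if phi > 0 else "left"
    return f"to your {side} ({abs(phi):.0f} degrees)"

# Usage example with made-up data:
received = [GuideInfo(1, 95, 1, 300, 1, "A Station"),
            GuideInfo(2, 200, 7, 120, 2, "City Hall")]
heading = 180.0            # device heading detected from the ultrasonic arrival direction
selected_category = 1      # category chosen by the user (e.g. 1: transport)
for info in extract_by_category(received, selected_category):
    print(f"{info.name} is {info.distance_m:.0f} m {relative_phrase(info.azimuth_deg, heading)}.")
```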
[0015] FIG. 2 is an external view of the guide terminal 20 worn by a passerby. The guide terminal 20 is shaped like a pair of stereo headphones: small speakers 21 (21L, 21R) sit over both ears, and three microphones 23, 24 and 25 are mounted on the head strap 22 that connects the speakers 21L and 21R. The microphone 23 is on the left, the microphone 24 on the right, and the microphone 25 at the front center, so that lines connecting the three microphones form an isosceles triangle with the microphone 25 at its apex. From the time offsets of the modulated signals received by the three microphones 23, 24 and 25, the direction in which the guide terminal 20, and hence the user, is facing is detected. The guide terminal 20 also includes a category selection unit 55 (see FIG. 5) that accepts the user's category selection. The category selection unit 55 may be provided on the headphones themselves or separately from them, for example as a remote control. The functional configuration of the guide terminal (FIG. 5) and the direction detection method (FIG. 6) are described later.

[0016] FIG. 3 is a block diagram of the guide plate 4. Externally, the guide plate is a panel carrying a map, guidance on nearby facilities and so on, so nearby facilities and roads can also be found simply by looking at it. Inside the panel, a plurality of speakers 37 (twelve in this figure) are arranged in a row facing forward (southward). The speakers 37 are preferably installed at roughly the same height as, or slightly higher than, the guide terminal 20 worn by the passersby 10, 11 and 12. When the speakers 37 are at the same height as the guide terminal 20, the accuracy of the direction detection by the guide terminal 20 is highest. On the other hand, placing the speakers 37 slightly higher than the guide terminal 20, that is, higher than the passersby, lets the emitted modulated sound propagate farther without being blocked by passersby and the like. The speakers 37 are therefore installed in a horizontal row at a height of about 150 to 200 cm above the ground.

[0017] By arranging the plurality of loudspeaker units in a straight line and emitting the same sound from all of them at the same time (in phase), propagation in oblique directions is cancelled by interference, and sound propagation in parallel waves, as indicated by the arrows in the figure, can be realized.

[0018] The guide plate 4 functions as a transmitting device that transmits the modulated sound of a plurality of channels (four channels in this embodiment) in parallel. A guide voice memory 31, a reading unit 32, a PN signal generator 34 and a modulator 33 are provided for each of channel 1 (ch1) through channel 4 (ch4). Control information and object guide information are stored in the guide voice memory 31 of each channel. The PN signal generator 34 of each channel generates a PN signal with a different code sequence. The chip rate of the PN signals lies in the ultrasonic range (e.g. 88.2 kHz).

[0019] In each channel, the PN signal modulated with the guide information (the modulated audio signal) is summed with the other channels by the adder 35. The summed modulated audio signal is amplified by the audio amplifier 36 and emitted from the speakers 37.
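To make the channel scheme of [0018]–[0019] concrete, the sketch below spreads each channel's data bits with that channel's own PN (pseudo-noise) code and then sums the channels, which is the essence of the code-division multiplexing these paragraphs describe. The code length, seeds, bit values and the simple ±1 chip mapping are illustrative assumptions; the patent fixes only the use of distinct PN code sequences per channel and an ultrasonic chip rate.

```python
import numpy as np

# Illustrative code-division transmitter in the spirit of [0018]-[0019]:
# each channel spreads its data with its own PN sequence and the adder
# sums all channels into one signal. Code length, seeds and the simple
# +/-1 chip mapping are assumptions, not taken from the patent.
CODE_LEN = 63  # chips per data bit (assumed)

def pn_sequence(seed, length=CODE_LEN):
    """Generate a +/-1 pseudo-noise chip sequence from a fixed seed."""
    rng = np.random.default_rng(seed)
    return rng.choice([-1.0, 1.0], size=length)

def spread(bits, pn):
    """Spread each data bit over the whole PN sequence (one bit -> CODE_LEN chips)."""
    symbols = np.where(np.asarray(bits) > 0, 1.0, -1.0)
    return np.concatenate([b * pn for b in symbols])

# Four channels with different PN codes, as in the embodiment.
pn_codes = {ch: pn_sequence(seed=ch) for ch in range(1, 5)}
channel_bits = {1: [1, 0, 1], 2: [0, 0, 1], 3: [1, 1, 0], 4: [0, 1, 1]}

# Adder 35: sum the modulated signals of all channels.
tx_signal = sum(spread(bits, pn_codes[ch]) for ch, bits in channel_bits.items())

# A receiver that knows channel 2's PN code can recover its bits by correlation.
ch = 2
recovered = [int(np.dot(tx_signal[i:i + CODE_LEN], pn_codes[ch]) > 0)
             for i in range(0, len(tx_signal), CODE_LEN)]
print("channel 2 bits:", channel_bits[ch], "->", recovered)
```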
[0020] FIG. 4A shows an example of the contents of the memory 31 of channel 1, and FIG. 4B an example of the contents of the memory 31 of channel 2. As shown in FIG. 4A, the memory 31 of channel 1 stores control information 100 and guide information 101 for guiding an object of category 7. The control information 100 includes sound emission azimuth information, which records the direction setting of the guide broadcast of the guide plate 4, and the category information of each channel. The sound emission azimuth information indicates the direction in which the speakers 37 emit sound. Azimuths are expressed as absolute azimuths (in degrees) measured clockwise from north (0 degrees). In this embodiment the speakers 37 are installed facing south and emit the modulated sound southward, so the emission azimuth is 180 (degrees).

[0021] The category information indicates the category of the guide information broadcast on each channel. In other words, the guide plate 4 assigns transmission channels according to the category of the objects and transmits the guide information accordingly. In this embodiment, guide information of categories 1 and 2 is transmitted on channel 2, guide information of categories 3 and 4 on channel 3, and guide information of categories 5 and 6 on channel 4. Guide information of category 7 is transmitted on channel 1, which also carries the control information 100.

[0022] Here, a category is information representing the type of object, and is a classification of destinations that users carrying the guide terminal 20 (passersby) are likely to choose in common when searching with a particular purpose in mind. A user's purpose might be "I want to take public transport", "I want to go sightseeing", "I am looking for a building", and so on, and a category number is assigned to each such purpose, for example 1: transport, 2: tourist spots, ..., 7: buildings.

[0023] The number of categories handled by each channel is not limited to one or two. When one channel carries several categories, they may simply be allocated in order of category number, or the allocation may be adjusted, based on the number of objects belonging to each category, so that each channel transmits about the same amount of guide information.

[0024] Since the guide terminal 20 first receives channel 1 and calculates its own heading, including the control information in the broadcast contents of channel 1 allows the calculation of the absolute heading of the guide terminal 20 and the switching of the guide channel to proceed smoothly.

[0025] Under the above allocation, the guide information 101 for guiding the object of category 7 is stored in the memory 31 of channel 1, and the guide information 102 to 104 for guiding the objects of categories 1 and 2 is stored in the memory 31 of channel 2.

[0026] Each item of guide information contains an object ID, an object azimuth, a category, a distance, a priority and an object name. The object ID is a number identifying the guide object (facility). The object azimuth is the azimuth of the object as seen from the guide position, that is, from the installation position of the guide plate 4, and, like the sound emission azimuth information, is expressed as an absolute azimuth (degrees) measured clockwise from north (0 degrees). The distance is the distance from the installation point of the guide plate 4 to the guide object. The priority indicates the importance of the guide object; when a plurality of guide objects are stored in the memory 31, they are read out in descending order of priority. The object name is text giving the name of the guide object, and this text is converted into speech when the guide phrase is synthesized. The object name may instead be stored as an audio signal.
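The memory layout of [0020]–[0026] can be pictured as a small data structure, as in the sketch below. The field names, the dictionary layout and the sample entries are assumptions made for illustration; the patent lists the fields and the channel-to-category assignment but does not prescribe a concrete data format.

```python
# Illustrative encoding of the memory contents described in [0020]-[0026].
from dataclasses import dataclass

@dataclass
class GuideRecord:
    object_id: int
    azimuth_deg: int   # absolute azimuth of the object, clockwise from north (0 = north)
    category: int      # e.g. 1: transport, 2: tourist spots, ... 7: buildings
    distance_m: int    # distance from the guide plate to the object
    priority: int      # smaller number = read out earlier (assumed convention)
    name: str

# Control information of channel 1 ([0020]-[0021]).
control_info = {
    "emission_azimuth_deg": 180,                 # speakers face south
    "channel_categories": {1: [7], 2: [1, 2], 3: [3, 4], 4: [5, 6]},
}

# Example contents of the guide voice memories (made-up entries).
channel_memory = {
    1: [GuideRecord(10, 200, 7, 120, 1, "City Hall")],
    2: [GuideRecord(20, 95, 1, 300, 1, "A Station"),
        GuideRecord(21, 260, 2, 450, 2, "Old Castle")],
}

def channel_for_category(category):
    """Look up which channel broadcasts a given category ([0021])."""
    for ch, cats in control_info["channel_categories"].items():
        if category in cats:
            return ch
    return None

# Each channel's reading unit reads its records in priority order ([0026]).
for ch, records in channel_memory.items():
    for rec in sorted(records, key=lambda r: r.priority):
        print(f"ch{ch}: {rec.name} at {rec.azimuth_deg} deg, {rec.distance_m} m")
```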
[0027] Although in this embodiment the azimuth of each guide object is stored as an absolute azimuth, it may instead be expressed as a relative azimuth measured from the sound emission azimuth.

[0028] The reading unit 32 of each channel repeatedly reads out the above information stored in the guide information memory 31 in sequence and inputs it to the modulator 33.

[0029] FIG. 5 is a block diagram of the guide terminal 20. The signals picked up by the omnidirectional microphones 23, 24 and 25 attached to the head strap are demodulated by the demodulators 41, 42 and 43, respectively. The PN signal of channel 1, generated by the PN signal generator 40, is input to each of the demodulators 41, 42 and 43. Each demodulator 41, 42, 43 shifts this PN signal along the time axis to detect the synchronization point with the audio signal (modulated signal) picked up by its microphone 23, 24, 25, and inputs the timing information of that synchronization point to the direction calculation unit 44. The demodulator 43 also demodulates the channel 1 data (control information) contained in the audio signal picked up by the microphone 25, and inputs the demodulated control information to the direction calculation unit 44 and the channel selection unit 45. The control information is demodulated from the sound picked up by the microphone 25 because the microphone 25 is mounted at the center of the strap 22 and is therefore expected to have better sound-pickup characteristics than the other microphones 23 and 24.

[0030] The direction calculation unit 44 calculates the relative angle Δθ of the guide terminal 20 with respect to the sound emission direction of the speakers 37 of the guide plate 4 from the time differences between the synchronization points of the channel 1 PN signal picked up by the three microphones 23, 24 and 25, and obtains the heading of the guide terminal 20 by adding the sound emission azimuth of the speakers 37 to the relative angle Δθ. The sound emission azimuth of the speakers 37 is contained in the control information of channel 1.

[0031] A method of detecting the relative angle Δθ of the guide terminal 20 will be described with reference to FIG. 6. In this description Δθ is handled within the angle range −90° ≤ Δθ ≤ 270°, but by wrapping appropriately over one full cycle of 360° it can be converted into the range 0° ≤ Δθ ≤ 360°.
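As an illustration of the synchronization-point detection in [0029], the sketch below slides the known channel-1 PN sequence along a simulated received signal and takes the lag with the largest correlation as the synchronization point for each microphone. The code length, the simulated delays and the noise level are assumptions; the patent specifies only that each demodulator time-shifts the PN signal to find its synchronization point.

```python
import numpy as np

# Illustrative synchronization-point detection in the spirit of [0029]:
# each demodulator correlates the known channel-1 PN sequence against the
# signal picked up by its microphone and reports the best-matching lag.
# The code length, delays and noise level below are assumptions.
rng = np.random.default_rng(0)
pn_ch1 = rng.choice([-1.0, 1.0], size=127)        # known channel-1 PN sequence

def received(delay_samples, noise=0.3, total=400):
    """Simulate one microphone's signal: the PN sequence arriving after a delay."""
    sig = np.zeros(total)
    sig[delay_samples:delay_samples + len(pn_ch1)] += pn_ch1
    return sig + noise * rng.standard_normal(total)

def sync_point(signal):
    """Slide the PN sequence over the signal and return the lag of peak correlation."""
    corr = np.correlate(signal, pn_ch1, mode="valid")
    return int(np.argmax(corr))

# The three microphones receive the same broadcast with slightly different delays,
# reflecting their geometry relative to the sound emission direction.
t23, t24, t25 = sync_point(received(60)), sync_point(received(52)), sync_point(received(55))
print("sync points (samples):", t23, t24, t25)   # expected near 60, 52, 55
```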
[0032] The time difference between corresponding synchronization points of the sound signals picked up by the microphones 23, 24 and 25 corresponds to a distance along the sound emission direction axis of the speakers 37. FIG. 6A illustrates the distances derived from these time differences. Let ΔL be the distance derived from the time difference between the synchronization timing T24 of the microphone 24 and the synchronization timing T23 of the microphone 23, and let ΔR be the distance derived from the time difference between the synchronization timing T24 of the microphone 24 and the synchronization timing T25 of the microphone 25. That is, ΔL = (T23 − T24) × V and ΔR = (T25 − T24) × V, where V is the speed of sound. Note that ΔL is negative when T23 < T24, and ΔR is negative when T25 < T24. Let L be the distance between the microphones 23 and 24. The relative angle Δθ between the sound emission direction and the heading of the guide terminal 20 then satisfies sin Δθ = ΔL / L, so Δθ is calculated as Δθ = sin⁻¹(ΔL / L) (Equation 1).

[0033] Equation 1 holds, however, only in the range −90° ≤ Δθ ≤ 90° (Expression 2), that is, when the guide terminal 20 faces within the 180° range centered on the sound emission direction (south) of the speakers 37. When the relative angle Δθ lies in the range 90° ≤ Δθ ≤ 270° (Expression 3), Δθ = 180° − sin⁻¹(ΔL / L) (Expression 4). Whether Δθ lies in the range of Expression 2 or that of Expression 3 can be determined from the magnitude relationship between ΔR and ΔL/2; ΔL/2 corresponds to the position, along the sound emission direction axis, of the midpoint M between the microphones 23 and 24.

[0034] As shown in FIG. 6B, ΔR > ΔL/2 when −90° < Δθ < 90°, and ΔR < ΔL/2 when 90° < Δθ < 270°. This determination makes it possible to distinguish two orientations that give the same ΔL but face in different directions, as shown in the figure.

[0035] The absolute heading of the guide terminal 20 is obtained by adding the sound emission azimuth (sound emission direction) of the speakers 37 to the relative angle Δθ. The heading of the guide terminal 20 calculated by the direction calculation unit 44 is input to the word editing unit 50.

[0036] Meanwhile, the control information, in particular the guide category information of each channel, is input from the demodulator 43 to the channel selection unit 45, and the category selected by the user is input from the category selection unit 55. The channel selection unit 45 selects the channel that carries the category selected by the user and outputs the channel selection information to the PN signal generator 46. For example, when the passerby 10 in FIG. 1 is looking for a transport facility such as a railway station, he or she operates the category selection unit 55 and selects transport facilities (category 1); the channel selection unit 45 then selects channel 2.

[0037] The channel selection information is input to the PN signal generator 46, which generates the PN signal of the selected channel. This PN signal is input to the demodulator 47. The demodulator 47 takes the audio signal picked up by the microphone 25, synchronizes the PN signal generated by the PN signal generator 46 with it, and demodulates the data of that channel. The demodulated data contains the guide information of the category selected by the user. The demodulated guide information is stored in the buffer 48.
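A minimal numeric sketch of the direction calculation in [0032]–[0035] follows: ΔL and ΔR are formed from the synchronization timings, Δθ is taken from Equation 1 or Expression 4 depending on the ΔR versus ΔL/2 test of [0034], and the emission azimuth is added to obtain the absolute heading. The speed of sound, microphone spacing and sample timings are invented values used only to exercise the formulas.

```python
import math

SPEED_OF_SOUND = 340.0  # m/s (approximate value; the patent does not fix V)

def terminal_heading(t23, t24, t25, mic_spacing, emission_azimuth_deg=180.0):
    """Compute the absolute heading of the guide terminal from the sync timings
    of the three microphones, following [0032]-[0035].
    t23, t24, t25: synchronization times (s) at microphones 23 (left), 24 (right), 25 (front).
    mic_spacing:   distance L between microphones 23 and 24 (m)."""
    dL = (t23 - t24) * SPEED_OF_SOUND              # [0032]: ΔL = (T23 - T24) x V
    dR = (t25 - t24) * SPEED_OF_SOUND              # ΔR = (T25 - T24) x V
    ratio = max(-1.0, min(1.0, dL / mic_spacing))  # clamp against measurement noise
    if dR > dL / 2:                                # [0034]: facing within ±90° of the emission direction
        delta_theta = math.degrees(math.asin(ratio))          # Equation 1
    else:                                          # facing away from the emission direction
        delta_theta = 180.0 - math.degrees(math.asin(ratio))  # Expression 4
    return (emission_azimuth_deg + delta_theta) % 360.0       # [0035]: add emission azimuth

# Example with invented timings: the computed heading comes out roughly
# north-west (about 325 degrees), i.e. the terminal faces away from the speakers.
print(terminal_heading(t23=0.01030, t24=0.01000, t25=0.00990, mic_spacing=0.18))
```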
[0038] From the guide information stored in the buffer 48, the data selection unit 49 selects the items that match the user's request. If only one item of guide information matching the category selected by the user is stored in the buffer 48, that item is selected. If several items matching the selected category are stored in the buffer 48, the item with the shortest distance and/or the highest priority is selected and reproduced. In this case, only the closest and/or highest-priority item may be selected and announced, or a plurality of items of guide information may be ordered by distance and/or by priority and reproduced one after another.

[0039] The guide information selected by the data selection unit 49 is read by the word editing unit 50, which edits the guide phrase from the guide information and a template. The guide phrase is, for example, a sentence such as "There is [A Station] [300] meters ahead, to your [rear]." The guide information contains the object azimuth (95), the distance (300), the object name (A Station) and so on, and the guide phrase is edited by filling these into a template such as "There is ○○, □□ meters ahead, to your △△." The template is stored in the template memory 51.

[0040] The relative azimuth φ (degrees) is obtained by subtracting the heading calculated by the direction calculation unit 44 from the object azimuth. The directional expression "to your ○○" is determined, for example, as follows: if the relative azimuth φ satisfies −10 ≤ φ ≤ 10, the direction is "straight ahead".

[0041] If 10 < φ ≤ 30, "ahead, slightly to the right"; if 30 < φ ≤ 60, "diagonally ahead to the right"; if 60 < φ ≤ 80, "to your right, slightly ahead"; if 80 < φ ≤ 100, "to your right"; if 100 < φ ≤ 150, "behind you, to the right".

[0042] If −10 > φ ≥ −30, "ahead, slightly to the left"; if −30 > φ ≥ −60, "diagonally ahead to the left"; if −60 > φ ≥ −80, "to your left, slightly ahead"; if −80 > φ ≥ −100, "to your left"; if −100 > φ ≥ −150, "behind you, to the left".

[0043] If −150 > φ ≥ −180 or 150 < φ ≤ 180, "directly behind you".

[0044] Since people resolve angles more coarsely behind them than within their forward field of view, the divisions for the rearward directions are coarse. Note that, because the word editing unit 50 synthesizes the phrase repeatedly at regular intervals, even if a phrase saying that the target object is behind the user is reproduced, a phrase saying that it is ahead will be reproduced once the user turns around.

[0045] The wording indicating the direction of the object is not limited to the above. It may instead be expressed as a relative angle, such as "30 degrees to the right", or in a form such as "You are facing north-west. The object is to the north-northeast.", and so on.

[0046] When there is no guide object matching the category selected by the user, a phrase stating that there is no such guide object is reproduced, and the guide phrase described above is not reproduced.
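The direction wording rules of [0040]–[0043] amount to a small lookup over the relative azimuth φ, sketched below. The English phrases are the translations used above; the normalization of φ to −180°..180° is an assumption implied by the stated ranges rather than spelled out in the patent.

```python
# Sketch of the relative-azimuth wording rules of [0040]-[0043]. The English
# phrases follow the translation above; the ranges follow the patent text.
def direction_phrase(object_azimuth_deg, device_heading_deg):
    # [0040]: relative azimuth = object azimuth minus the terminal's heading,
    # normalized to -180..180 so that positive means "to the right" (assumed).
    phi = (object_azimuth_deg - device_heading_deg + 180.0) % 360.0 - 180.0
    if -10 <= phi <= 10:
        return "straight ahead"
    if phi > 150 or phi < -150:
        return "directly behind you"          # [0043]
    side = "right" if phi > 0 else "left"     # [0041] and [0042] are mirror images
    a = abs(phi)
    if a <= 30:
        return f"ahead, slightly to the {side}"
    if a <= 60:
        return f"diagonally ahead to the {side}"
    if a <= 80:
        return f"to your {side}, slightly ahead"
    if a <= 100:
        return f"to your {side}"
    return f"behind you, to the {side}"       # 100 < |phi| <= 150

# Usage: object at absolute azimuth 95 degrees, user facing north-west (315 degrees).
print(direction_phrase(95, 315))   # -> "behind you, to the right"
```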
[0047] The guide phrase edited by the word editing unit 50 is input to the speech synthesis unit 52, which synthesizes it into a digital speech signal. Any known synthesis method may be used. The synthesized digital speech signal is converted into an analog speech signal by the D/A converter 53, amplified by the audio amplifier 54, and emitted from the speakers 21 of the headset toward the ears of the user (passerby).

[0048] As described above, the direction calculation unit 44 and the word editing unit 50 operate continuously, so when the user changes direction, the direction-indicating wording of the guide voice changes accordingly. A passerby wearing the guide terminal 20 can therefore find the correct direction of the guide object by turning while listening to the direction-indicating wording of the guide voice.

[0049] The number of transmission channels of the guide plate 4 is not limited to four; it may be larger or smaller. Guide information of all categories may also be transmitted on a single channel.

[0050] In this embodiment a plurality of channels are superimposed in the same frequency band by using PN signals with different code sequences, but the channels may instead be placed in different frequency bands using frequency modulation or the like.

[0051] The voice guidance system of this embodiment is configured as a system in which a guide plate 4 is installed on a street to give route guidance to passersby, but the voice guidance system of the present invention can also be applied to other uses, for example guidance describing the view in each direction from an observation deck, route guidance to attractions in an amusement park, or guidance to sales areas in a commercial facility.

[0052] Furthermore, in this embodiment the device that transmits the modulated sound (guide information) is a guide plate 4 that also guides passersby visually with a map and the like, but it need not be a guide plate.

[0053] FIG. 1 is a view showing the street on which the voice guidance system that is an embodiment of the present invention is installed; FIG. 2 is an external view of the guide terminal of the voice guidance system; FIG. 3 is a block diagram of the guide plate of the voice guidance system; FIG. 4 shows the structure of the guide information of the voice guidance system; FIG. 5 is a block diagram of the guide terminal; FIG. 6 is a diagram for explaining the method of calculating the relative angle in the guide terminal.

Explanation of reference signs [0054] 1, 2 ... road; 3 ... intersection; 4 ... guide plate (information board); 10, 11, 12 ... passerby; 20 ... guide terminal