US20190036719A1 - Connecting physical resources to virtual collaboration meeting - Google Patents
- Publication number
- US20190036719A1 (U.S. application Ser. No. 15/660,386)
- Authority
- US
- United States
- Prior art keywords
- network connected
- connected device
- graphical representation
- map
- database
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/02—Details
- H04L12/16—Arrangements for providing special services to substations
- H04L12/18—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
- H04L12/1813—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
- H04L12/1818—Conference organisation arrangements, e.g. handling schedules, setting up parameters needed by nodes to attend a conference, booking network resources, notifying involved parties
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/29—Geographical information databases
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/951—Indexing; Web crawling techniques
-
- G06F17/30241—
-
- G06F17/30864—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
- H04L67/125—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks involving control of end-device applications over a network
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
Definitions
- the present disclosure relates to connecting network connected devices to a collaboration meeting.
- the Internet of Things is a general term used to describe the addition of networking capabilities to physical objects or “things” that serve some purpose or function outside of computing and/or networking technologies (i.e., traditionally “unconnected” or “offline” devices), such as thermometers, refrigerators, lights, wristbands, and sensors.
- these “things,” sometimes referred to as IoT enabled-devices, IoT devices, or special purpose network connected devices are embedded with electronics, software, and network interfaces, which enables the physical objects to send and/or receive data packets over a network.
- FIG. 1 is an overview diagram of a collaboration environment, according to an example embodiment.
- FIG. 2 is a graphical user interface of a collaboration application, according to an example embodiment.
- FIG. 3 is another graphical user interface of a collaboration application, according to an example embodiment.
- FIG. 4 is a mapping database, according to an example embodiment.
- FIG. 5 is a diagrammatic illustration of communications between network connected devices and a collaboration application, according to an example embodiment.
- FIG. 6 is a flowchart of a method for dynamically adding a network connected device to a collaboration meeting, according to an example embodiment.
- FIG. 7 is a flowchart of a method for mapping network connected devices, according to an example embodiment.
- FIG. 8 is a block diagram of a collaboration server configured to execute network connected device mapping techniques, according to an example embodiment.
- FIG. 9 is a flowchart of a generalized method in accordance with examples presented herein.
- a collaboration application presents a map of a geographic area that includes a network connected device.
- the map includes a graphical representation of the network connected device.
- the graphical representation has a location on the map that corresponds to a current physical location of the network connected device in the geographic area.
- the collaboration application receives an indication of a user selection of the graphical representation, and, in response to receiving the indication, determines that the user selection of the graphical representation corresponds to a user selection of the network connected device.
- the collaboration application sends, to the network connected device, an invitation to join a collaboration meeting as a peer member of the collaboration meeting.
- Conventional collaboration applications may send, to human users/participants, invitations (e.g., over email) to join a collaboration meeting. These invitations may be sent via user devices (e.g., laptops, mobile phones, etc.).
- conventional collaboration applications generally cannot send invitations to network connected devices to join the collaboration meeting as peer participants of a collaboration meeting. Instead, conventionally, such devices may be included in a collaboration meeting by having a human act as a proxy for the devices.
- a collaboration application may dynamically add or remove peer network connected devices based on the current physical locations of the network connected devices. As will become apparent based on the following description, permitting a network connected device to join as a peer participant based on its current physical location may enhance participant experience during the collaboration meeting.
- the collaboration environment 100 includes an administrator 105 , collaboration application 110 , and a plurality of peer meeting participants 115 .
- the collaboration application 110 hosts a collaboration meeting in which peer meeting participants 115 are participating.
- the plurality of peer meeting participants 115 includes a drone 115 ( 1 ), a camera 115 ( 2 ), a sensor 115 ( 3 ), a lightbulb 115 ( 4 ), a display screen 115 ( 5 ), and a human 115 ( 6 ).
- Peer meeting participants 115 ( 1 )- 115 ( 5 ) are network connected devices.
- the human 115 ( 6 ) may be participating via another network connected device (e.g., a laptop).
- the collaboration application 110 may dynamically add or remove one or more of the peer meeting participants 115 automatically or based on user input from the administrator 105 .
- Administrator 105 may be human participant 115 ( 6 ).
- the collaboration application 110 may dynamically add or remove one or more of peer meeting participants 115 based on the respective current geographic locations of the peer meeting participants 115 .
- collaboration application 110 includes mapping logic 120 to perform techniques described herein.
- the mapping logic 120 enables one or more of the peer meeting participants 115 to connect to a collaboration meeting dynamically (e.g., without associating with a static entity or having to pass through a proxy).
- the collaboration application 110 may send invitations to peer meeting participants 115 ( 4 )- 115 ( 6 ) during the scheduling, or at the start, of a collaboration meeting.
- the collaboration application 110 may subsequently send invitations to peer meeting participants 115 ( 1 )- 115 ( 3 ) after the collaboration meeting starts.
- the collaboration application 110 may automatically send an invitation to a drone upon determining the drone has entered a geographic area (e.g., a conference room), and may remove the drone from the collaboration meeting upon determining the drone has exited the geographic area.
- the mapping logic 120 permits the collaboration application 110 to dynamically add certain network connected devices (e.g., network connected devices managed or in use by a participant, network connected devices in a certain geographical area, etc.).
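The geographic-area behavior described above (inviting a drone when it enters a conference room and removing it when it leaves) can be sketched as a simple geofence check. This is only an illustrative sketch: the rectangular area bounds, the `Meeting` class, and its peer set are assumptions, not part of the disclosure.

```python
# Hypothetical sketch: auto-invite a mobile device (e.g., a drone) when it
# enters a geographic area, and remove it when it exits. The rectangular
# "conference room" bounds and the meeting API are illustrative assumptions.

def in_area(pos, area):
    """True if an (x, y) position lies inside a rectangular area."""
    (x, y), (x0, y0, x1, y1) = pos, area
    return x0 <= x <= x1 and y0 <= y <= y1

class Meeting:
    def __init__(self, area):
        self.area = area
        self.peers = set()

    def on_location_update(self, device, pos):
        """Dynamically add or remove a peer based on its current location."""
        if in_area(pos, self.area) and device not in self.peers:
            self.peers.add(device)       # send invitation to join as a peer
        elif not in_area(pos, self.area) and device in self.peers:
            self.peers.discard(device)   # remove from the collaboration meeting

meeting = Meeting(area=(0, 0, 10, 10))           # assumed conference-room bounds
meeting.on_location_update("drone-1", (5, 5))    # drone enters the area: invited
meeting.on_location_update("drone-1", (25, 5))   # drone exits the area: removed
```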
- a collaboration application may permit one or more human users and/or network connected devices to communicate via a virtual meeting session.
- the graphical user interface 200 presents a map of an indoor geographic area.
- the map includes two conference rooms 205 ( 1 ), 205 ( 2 ), an open area 210 , a laboratory 215 , and another conference room 220 .
- the map also includes graphical representations of network connected devices. The graphical representations have locations on the map that correspond to current physical locations of the network connected devices in the geographic area.
- FIG. 2 provides three example collaboration meetings: a first collaboration meeting represented by arrow 225 ( 1 ), a second collaboration meeting represented by arrow 225 ( 2 ), and a third collaboration meeting represented by arrow 225 ( 3 ).
- the first and second collaboration meetings 225 ( 1 ), 225 ( 2 ) illustrate region selection, while the third collaboration meeting 225 ( 3 ) illustrates clickable selection.
- the first collaboration meeting 225 ( 1 ) includes the network connected devices located in conference rooms 205 ( 1 ), 205 ( 2 ).
- a user may establish the first collaboration meeting 225 ( 1 ) by selecting (e.g., dragging a cursor over) the region spanning the graphical representation of the conference rooms 205 ( 1 ), 205 ( 2 ).
- the collaboration application determines that the user selection of the graphical representation of the network connected devices located in the graphical representations of conference rooms 205 ( 1 ), 205 ( 2 ) corresponds to a user selection of the network connected devices located in conference rooms 205 ( 1 ), 205 ( 2 ).
- the collaboration application sends, to the network connected devices, an invitation to join the first collaboration meeting 225 ( 1 ) as peer members.
- the network connected devices in conference rooms 205 ( 1 ), 205 ( 2 ) (e.g., telepresence units, microphones, etc.) dynamically join the first collaboration meeting 225 ( 1 ). This allows conference rooms 205 ( 1 ), 205 ( 2 ) to be utilized for a single collaboration meeting.
- the second collaboration meeting 225 ( 2 ) includes network connected devices located in select, physically separate regions of open area 210 .
- a celebration is occurring in open area 210 .
- a user (e.g., an administrator or meeting participant) who is authorized to handle the network connected devices in the open area 210 may establish the second collaboration meeting 225 ( 2 ) by selecting (e.g., dragging a cursor over) the select regions of the graphical representation of the open area 210 .
- the user may first select “connected lighting” in a drop-down menu (not shown) in order to filter, from the map, network connected devices that are not connected lighting devices. The user selects (e.g., by dragging a cursor over and/or clicking) the region spanning the graphical representation of the open area.
- the collaboration application determines that the user selection of the graphical representation of the network connected devices (e.g., connected lighting) located in the select regions of the graphical representations of open area 210 corresponds to a user selection of the network connected devices located in the select regions of the open area 210 .
- the collaboration application sends, to the network connected devices, an invitation to join the second collaboration meeting 225 ( 2 ) as peer members.
- the network connected devices in the select regions of the open area 210 (e.g., connected lighting) dynamically join the second collaboration meeting 225 ( 2 ).
- a user may wish to cause a specific, individually owned network device (e.g., drone, router, etc.) to join a collaboration meeting for debugging purposes.
- a user wishes for a switch located in laboratory 215 to join the third collaboration meeting 225 ( 3 ).
- the drop-down menu 230 may only display devices/categories of devices that the user is authorized to access.
- the user may first select “Lab ‘A’ Switches” in drop-down menu 230 in order to filter, from the map, network connected devices that are not switches located in Lab ‘A’ (e.g., Lab ‘A’ Servers).
- the third collaboration meeting 225 ( 3 ) occurs at least partly in conference room 220 .
- the user may cause the switch to join the third collaboration meeting 225 ( 3 ) by selecting (e.g., clicking) the graphical representation of the switch.
- the collaboration application determines that the user selection of the graphical representation of the switch corresponds to a user selection of the switch.
- the collaboration application sends, to the switch, an invitation to join the third collaboration meeting 225 ( 3 ) as a peer member.
- the switch dynamically joins the third collaboration meeting 225 ( 3 ). This allows the user to, for example, address debugging issues in collaboration with others.
- Each network connected device displayed on the map may have an associated Uniform Resource Identifier (URI).
- a light in an open area (open area “3”) in a building (building “9”) in a geographic location (“SJ”) may have the URI SJ/9/OA3/L22.
- the URI SJ/9/OA3 may correspond to all devices (e.g., including light 22) in open area 3.
- Open area 3 may be, for example, open area 210 as illustrated in FIG. 2 .
- Such URIs are addresses to which meeting invitations may be sent.
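The hierarchical URI scheme above (location/building/area/device) lends itself to prefix-based group addressing: an invitation sent to an area prefix reaches every device under it. The following sketch illustrates this; the helper functions and the extra device names are assumptions made for the example.

```python
# Hypothetical sketch of the hierarchical device URI scheme (e.g.,
# SJ/9/OA3/L22 for light 22 in open area 3 of building 9 in SJ). Prefix
# matching lets an invitation address one device or every device in an area.

def device_uri(location, building, area, device):
    return f"{location}/{building}/{area}/{device}"

def devices_under(prefix, uris):
    """All device URIs addressed by an area prefix such as 'SJ/9/OA3'."""
    return [u for u in uris if u == prefix or u.startswith(prefix + "/")]

uris = [
    device_uri("SJ", 9, "OA3", "L22"),   # light 22 in open area 3
    device_uri("SJ", 9, "OA3", "L23"),   # assumed second light, for illustration
    device_uri("SJ", 9, "CR1", "CAM1"),  # assumed camera in conference room 1
]

assert device_uri("SJ", 9, "OA3", "L22") == "SJ/9/OA3/L22"
# An invitation addressed to SJ/9/OA3 reaches both lights but not the camera.
assert devices_under("SJ/9/OA3", uris) == ["SJ/9/OA3/L22", "SJ/9/OA3/L23"]
```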
- the graphical user interface 300 presents a map of an outdoor geographic area.
- the map is a graphical representation of a campus that includes security cameras.
- the map includes graphical representations 305 ( 1 )- 305 ( 5 ) of the cameras.
- the locations of the graphical representations 305 ( 1 )- 305 ( 5 ) of the security cameras on the map correspond to the physical locations of the cameras in the campus.
- FIG. 3 provides an example collaboration meeting involving a campus security team.
- a user (e.g., a member of the campus security team) may wish to establish a collaboration meeting that includes the security cameras located in a region of interest 310 .
- a drop-down menu 315 may present, to the user, categories of network connected devices.
- the category “Cameras” includes the cameras in the region of interest 310 , for example.
- the user may select “Cameras” from the drop-down menu 315 . This causes the collaboration application to update the map to display the graphical representations 305 ( 1 )- 305 ( 5 ).
- selecting “Cameras” may cause the map to begin displaying graphical representations of security cameras, or to cease displaying graphical representations that are not graphical representations of security cameras.
- the map may update to display only the graphical representations 305 ( 1 )- 305 ( 5 ) that the user is authorized to select.
- the user may select the region 310 by, for example, dragging a cursor 320 over the region 310 .
- the collaboration application determines that the user selection of the graphical representation of the cameras 305 ( 2 )- 305 ( 4 ) corresponds to a user selection of the security cameras located in the corresponding region of campus.
- the collaboration application sends, to the cameras 305 ( 2 )- 305 ( 4 ), an invitation to join the collaboration meeting as peer members. This allows the user to, for example, view security footage from the selected cameras.
- similar use cases may exist for selecting drones, connected vehicles, sensors, etc.
- the collaboration application may allow for real-time sensor monitoring to assess a situation at a site/location during an emergency collaboration meeting. In this example, sensors may be dynamically added or removed to assess different sites.
- the drop-down menu 315 may list only categories of devices that are currently located in the region of interest 310 . For example, if there are no drones currently located in the region of interest 310 , the drop-down menu 315 may not display “drones” as a category. Thus, the drop-down menu 315 may dynamically update based on the current geographical location of the network connected devices in the region of interest 310 . The drop-down menu 315 may only display devices/categories of devices that a user is authorized to access.
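The drop-down behavior just described can be sketched as a filter over two conditions: the device is currently inside the region of interest, and the user is authorized to access it. The device records, role names, and coordinates below are illustrative assumptions.

```python
# Hypothetical sketch: build the drop-down category list from devices that
# are currently inside the region of interest AND that the user is
# authorized to access. All records and roles here are illustrative.

devices = [
    {"name": "cam1", "category": "Cameras", "pos": (2, 3), "authorized": {"security"}},
    {"name": "cam2", "category": "Cameras", "pos": (8, 1), "authorized": {"security"}},
    {"name": "light9", "category": "Connected Lighting", "pos": (4, 4), "authorized": {"facilities"}},
    {"name": "drone7", "category": "Drones", "pos": (40, 40), "authorized": {"security"}},
]

def menu_categories(devices, region, role):
    """Categories shown in the drop-down for a given region and user role."""
    x0, y0, x1, y1 = region
    return sorted({
        d["category"] for d in devices
        if x0 <= d["pos"][0] <= x1 and y0 <= d["pos"][1] <= y1
        and role in d["authorized"]
    })

# A security-team member viewing region (0,0)-(10,10): the drone is outside
# the region and the lighting is not authorized, so only "Cameras" is listed.
assert menu_categories(devices, (0, 0, 10, 10), "security") == ["Cameras"]
```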
- a collaboration application may determine that a user selection of a graphical representation corresponds to a user selection of a network connected device. To make this determination, a collaboration application may maintain a mapping database that correlates the physical location and the graphical representation of the network connected device.
- An example mapping database 400 is shown in FIG. 4 .
- the mapping database 400 includes a device field 405 , an authorization field 410 , a physical location field 415 , a map coordinates field 420 , a hostname field 425 , an Internet Protocol (IP) address field 430 , an interface field 435 , and a protocol field 440 .
- the device field 405 provides information relating to the type of network connected device.
- “camera_parking” in Row A of the mapping database 400 may refer to a security camera in the region of interest in FIG. 3 .
- the authorization field 410 provides information relating to the human(s) who are authorized to configure/use/invite the corresponding network connected device.
- the authorization field 410 in Row A may indicate that any member of the security team for building 23 is authorized to send an invitation to the camera.
- the physical location field 415 indicates the physical location of the corresponding network connected device.
- the physical location of the camera is parking lot 1 of building 23.
- the physical location field 415 may dynamically update based on the current physical location of the network connected devices. This may involve the collaboration application determining a subsequent physical location of a network connected device in a geographic area. For instance, the physical location of a drone may change because drones are mobile network connected devices. Thus, if the mapping table 400 were to include information relating to a drone, the physical location field 415 for the drone would update/change dynamically (e.g., periodically, in response to a stimulus/message/notification, etc.). Fixed devices (e.g., security camera 305 ( 4 )) may not change physical locations, and therefore the physical location field 415 for such devices may not dynamically update.
- the map coordinates field 420 indicates the location of the graphical representation of a network connected device on the map. This location corresponds to the physical location of the network connected device. In the example of Row A, the map coordinates ⁇ X i , Y j ⁇ may correspond to parking lot 1 of building 23 as graphically represented on the map of FIG. 3 .
- the collaboration application may consult the mapping database 400 to determine that the location of the graphical representation of the network device (e.g., the map coordinates field 420 ) is correlated with the network connected device (e.g., the device field 405 ).
- the collaboration application may determine that the user selection of the graphical representation corresponds to a user selection of the network connected device.
- the collaboration application may query the mapping database 400 for fixed devices (e.g., pre-built into the mapping database 400 ) and/or mobile devices.
- the map coordinates field 420 may dynamically update.
- the collaboration application dynamically updates the mapping database 400 to correlate a network connected device with a subsequent physical location of the network connected device and a subsequent location of the graphical representation that corresponds to the subsequent physical location. For instance, if the physical location field 415 for a drone changes/updates, the map coordinates field 420 for the drone may dynamically update accordingly.
- the collaboration application may dynamically update the map based on the mapping database 400 . This may permit the user to dynamically add or remove network connected devices from the collaboration meeting even if the network connected devices change location.
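The selection-to-device correlation described above can be sketched as a hit test of the user's dragged rectangle against the map coordinates stored in the mapping database. The rows, coordinates, and the assumed span of conference rooms 205(1)/205(2) below are illustrative, not taken from the disclosure.

```python
# Hypothetical sketch: resolve a dragged selection rectangle on the map to
# the network connected devices it covers, by consulting mapping-database
# rows that correlate each device with its map coordinates.

mapping_db = [
    {"device": "telepresence_1", "map_xy": (12, 5)},
    {"device": "microphone_1",   "map_xy": (14, 6)},
    {"device": "switch_lab_a",   "map_xy": (40, 22)},
]

def selected_devices(selection, rows):
    """Map a user's rectangular selection to the covered device names."""
    x0, y0, x1, y1 = selection
    return [r["device"] for r in rows
            if x0 <= r["map_xy"][0] <= x1 and y0 <= r["map_xy"][1] <= y1]

# Dragging over the conference rooms (assumed to span (10,0)-(20,10) on the
# map) selects the telepresence unit and microphone, but not the lab switch.
assert selected_devices((10, 0, 20, 10), mapping_db) == ["telepresence_1", "microphone_1"]
```

Because mobile devices update their `map_xy` entries dynamically, re-running the same hit test after an update naturally reflects the devices' new positions.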
- the hostname field 425 indicates the host name of the corresponding network connected device. As illustrated by Row A, certain network connected devices may not have a specified host name.
- the IP address field 430 indicates the IP address of the corresponding network connected device. For instance, the IP address of the camera is 10.10.10.1.
- the interface field 435 indicates the type of interface associated with the corresponding network connected device. Like the hostname field 425 , certain network connected devices may not have a specified type of interface.
- the protocol field 440 indicates the particular protocol(s) by which the collaboration application communicates with the corresponding network connected device. In this example, the security camera uses the Constrained Application Protocol (CoAP) as described in Internet Engineering Task Force (IETF) Request for Comments (RFC) 7252.
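Taken together, the eight fields of FIG. 4 can be sketched as one record type. The values for Row A below follow the text where given (device type, IP address, protocol, empty hostname/interface); the dataclass layout, authorization string, location string, and coordinates are assumptions.

```python
# Hypothetical sketch of one mapping-database row (FIG. 4, Row A) using the
# eight fields described above. Field values follow the text where stated;
# the concrete strings and coordinates are otherwise illustrative.

from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class MappingRow:
    device: str                     # type of network connected device
    authorization: str              # who may configure/use/invite it
    physical_location: str          # current physical location (may update)
    map_coordinates: Tuple[int, int]  # location of its graphical representation
    hostname: Optional[str]         # some devices have no specified host name
    ip_address: str
    interface: Optional[str]        # some devices have no specified interface
    protocol: str                   # protocol used to reach the device

row_a = MappingRow(
    device="camera_parking",
    authorization="security_team_bldg23",     # assumed label
    physical_location="bldg23/parking_lot_1",
    map_coordinates=(3, 7),                   # {Xi, Yj} on the FIG. 3 map
    hostname=None,
    ip_address="10.10.10.1",
    interface=None,
    protocol="CoAP",                          # IETF RFC 7252
)
```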
- FIG. 5 is a diagrammatic illustration 500 of example communications between network connected devices 505 ( 1 )- 505 ( 3 ) and collaboration application 110 .
- the collaboration application 110 communicates with the network connected devices 505 ( 1 )- 505 ( 3 ) via CoAP (e.g., CoAP observe).
- the network connected devices 505 ( 1 )- 505 ( 3 ) may register with the collaboration application 110 using CoAP before the start of the collaboration meeting.
- the host/user may register network connected devices 505 ( 1 )- 505 ( 3 ), which in this example are required for the collaboration meeting.
- the collaboration application 110 may determine the current physical location of a network connected device in a geographic area (e.g., via CoAP) before the meeting begins and construct the mapping database 400 to include the physical location field 415 .
- the collaboration application 110 may present the map to a user based on the mapping database 400 .
- the map permits a user to, for example, invite specified network connected devices to a scheduled or ongoing collaboration meeting.
- the user may invite network connected devices based on geography and/or association (e.g., mobile phones, user-owned network connected devices, etc.).
- the collaboration application 110 may send commands/notifications (e.g., commands with predefined instructions) to network connected devices 505 ( 1 )- 505 ( 3 ).
- the meeting participants may receive (e.g., via email) a control mechanism to override any such command or apply a new/different command.
- the collaboration application 110 may send a CoAP command/notification to network connected devices 505 ( 1 )- 505 ( 3 ) (e.g., telepresence units, connected lighting, sensors, etc.).
- the command/notification may prompt network connected devices 505 ( 1 )- 505 ( 3 ) to join the meeting automatically and perform predefined actions.
- a security camera or drone may begin streaming video to the collaboration application when the collaboration meeting begins.
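The invitation/command notification could carry a small structured payload naming the meeting and the predefined actions to run on join. The payload shape, field names, and meeting identifier below are assumptions; the disclosure only states that commands with predefined instructions are sent (e.g., over CoAP) and that devices join and act on them.

```python
# Hypothetical sketch of an invitation/command notification that the
# collaboration application might send to a device. The JSON field names
# and values are assumptions made for illustration.

import json

def build_invitation(device_uri, meeting_id, actions):
    """Build a (destination, payload) pair for a meeting invitation."""
    payload = {
        "type": "meeting_invitation",
        "meeting": meeting_id,
        "role": "peer",          # join as a peer member of the meeting
        "on_join": actions,      # predefined actions for the device
    }
    return device_uri, json.dumps(payload)

# e.g., a security camera invited to stream video when the meeting begins
uri, body = build_invitation("SJ/9/OA3/CAM4", "sec-review-42",
                             actions=["stream_video"])
decoded = json.loads(body)
assert decoded["role"] == "peer" and decoded["on_join"] == ["stream_video"]
```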
- the collaboration application 110 may enable manual or automatic dynamic addition or removal of one or more network connected devices to/from the collaboration meeting.
- a user may invite/control, e.g., connected lighting, thermostats, cameras, etc.
- connected lighting may be invited to a collaboration meeting and set to a festive setting during a celebration.
- sub-groups of network connected devices may be specified.
- a sub-group may include a drone and a particular display in a meeting room.
- the drone may dynamically stream video solely to the particular display, while other displays in the meeting room may be used for other purposes (e.g., a presentation of the meeting).
- data from multiple sensors in a sub-group may be presented side-by-side in real-time during a collaboration meeting.
- mapping sensors in this manner enables the dynamic grouping of sensors based on physical location and the dynamic creation of different groups to achieve different results.
- These sensors, or groups of sensors, may be added to a collaboration meeting.
- Dynamic grouping may be based on, for example, site location, threshold bundles, faulty sensor bundles, or any other well-defined parameters.
- the collaboration application maps the sensors as well as other network connected devices (e.g., devices on the Internet), so that two groups may be dynamically added during a meeting.
- one group may bundle temperature sensors at a particular site and another group may include the heating, ventilation, and air conditioning (HVAC) system on the same site.
- the HVAC systems may be controlled to achieve a desired result. Feedback on this result may be provided immediately by the sensor bundle. It will be appreciated that sensor monitoring is just one example use case.
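The sensor-monitoring use case above can be sketched as two dynamic groups: a temperature-sensor bundle at a site and a control decision for that site's HVAC system derived from the bundle's readings. All sensor names, readings, and the setpoint are illustrative assumptions.

```python
# Hypothetical sketch of the sensor-monitoring use case: group temperature
# sensors by site (a "well-defined parameter"), then derive a simple HVAC
# command from the bundle. All values here are illustrative.

sensors = [
    {"name": "t1", "site": "siteA", "kind": "temperature", "reading": 27.5},
    {"name": "t2", "site": "siteA", "kind": "temperature", "reading": 26.0},
    {"name": "t3", "site": "siteB", "kind": "temperature", "reading": 21.0},
]

def group(devices, **criteria):
    """Dynamically group devices on well-defined parameters (e.g., site)."""
    return [d for d in devices
            if all(d.get(k) == v for k, v in criteria.items())]

def hvac_command(temp_group, setpoint=24.0):
    """Decide a control action from the sensor bundle's average reading."""
    avg = sum(s["reading"] for s in temp_group) / len(temp_group)
    return "cool" if avg > setpoint else "hold"

site_a_temps = group(sensors, site="siteA", kind="temperature")
assert [s["name"] for s in site_a_temps] == ["t1", "t2"]
assert hvac_command(site_a_temps) == "cool"   # average 26.75 exceeds 24.0
```

Feedback is then immediate: after the HVAC group acts, re-reading the same sensor bundle shows whether the desired result was achieved.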
- FIG. 6 is a flowchart 600 of an example method for dynamically adding a network connected device to a collaboration meeting.
- the collaboration meeting starts.
- at 630 , the collaboration application selects the network connected device (e.g., using a mapping database). If the network connected device is user owned, at 635 the network connected device may be manually or automatically selected from a list of authorized devices, and the flow then proceeds to 630 , where the collaboration application registers the network connected device as selected.
- the collaboration application dynamically adds (e.g., sends an invitation to) a network connected device to the collaboration meeting. The flow proceeds to 615 , where the collaboration meeting stops.
- a collaboration application may, at the start of the collaboration meeting, send invitations to network connected devices that a user registered/selected before the collaboration meeting started.
- FIG. 7 is a flowchart 700 of an example method for mapping network connected devices.
- the collaboration application acquires information (e.g., location information, authorization information, IP address, etc.) regarding the network connected devices.
- the collaboration application may communicate with the network connected devices using, for example, CoAP, H.325, Ethernet protocols, protocols for cloud based applications, etc.
- the collaboration application may also determine whether the device is to be added or removed from a collaboration meeting.
- the mapping database (e.g., mapping database 400 ) may include information of a well-defined area in a building floor that is mapped to the appropriate mapping coordinates.
- a conference room may have a range in the map from ⁇ X i , Y i ⁇ to ⁇ X n , Y n ⁇ .
- the mapping table may include map coordinates ⁇ X k , Y k ⁇ for a graphical representation of a network connected device that correspond to the physical location of the network connected device in the conference room.
- the mapping database may include manually input information (e.g., for certain fixed devices) and/or automatically discovered information (e.g., for certain mobile devices).
- the mapping database may include manually input information for a security camera, as well as information automatically obtained from a connected vehicle (e.g., via a global positioning system).
- the collaboration application detects a new network connected device (e.g., a mobile device such as a drone).
- the collaboration application may tag the network connected device as unidentified and display a corresponding graphical representation on the map.
- the collaboration application may also/alternatively send a notification to a user/administrator about the new network connected device.
- if the new network connected device supports the protocol (e.g., CoAP), the collaboration application may automatically obtain the desired information (e.g., information to populate the mapping database).
- the collaboration application may automatically update the mapping database (e.g., with updated map coordinates, physical location, etc.) at 725 .
- External protocols may provide other information, such as IP address, interface, port, and other desired information.
- the mapping database may also maintain this other information.
- the flow proceeds to 730 , where it is determined whether precise mapping is required. If precise mapping is not required, the user/administrator may manually provide general location information and authorization details at 735 . For example, the user may manually input Conf_Room_1 as the physical location and all_employees as the authorization information; the collaboration application may then automatically allocate the next available map coordinates from the predefined range corresponding to conference room 1 and update the mapping database to include the desired information. If precise mapping is required, the user/administrator may manually provide precise location information and authorization details at 740 . For example, the user may manually provide a precise location and input admin_to_vp as the authorization information. In either case, external protocols may provide other information, such as IP address, interface, and port, which the mapping database may also maintain.
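The "next available map coordinates from the predefined range" step can be sketched as a scan over the room's coordinate grid. The room range, step size, and identifiers below are assumptions for illustration.

```python
# Hypothetical sketch: when precise mapping is not required, allocate the
# next available map coordinate from the predefined range for the named
# room. The range and scan order are illustrative assumptions.

ROOM_RANGES = {"Conf_Room_1": ((0, 0), (3, 3))}   # {Xi,Yi} .. {Xn,Yn}

def next_available(room, taken, ranges=ROOM_RANGES):
    """First coordinate in the room's predefined range not yet allocated."""
    (x0, y0), (x1, y1) = ranges[room]
    for y in range(y0, y1 + 1):                   # scan the room grid in order
        for x in range(x0, x1 + 1):
            if (x, y) not in taken:
                return (x, y)
    raise ValueError(f"no free coordinates left in {room}")

taken = {(0, 0), (1, 0)}                          # coordinates already in use
assert next_available("Conf_Room_1", taken) == (2, 0)
```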
- FIG. 8 is a block diagram of a collaboration server 800 that is configured to implement the techniques presented herein.
- the collaboration server 800 includes a memory 805, one or more processors 810, and a network interface 815.
- the memory 805 includes mapping logic 820 (e.g., mapping logic 120 of FIG. 1) and a database 825 (e.g., mapping database 400 of FIG. 4).
- the one or more processors 810 are configured to execute instructions stored in the memory 805 (e.g., mapping logic 820). When executed by the one or more processors 810, the mapping logic 820 enables the collaboration server 800 to perform the operations associated with the techniques described herein.
- the memory 805 may be read only memory (ROM), random access memory (RAM), magnetic disk storage media devices, optical storage media devices, flash memory devices, or electrical, optical, or other physical/tangible memory storage devices.
- the memory 805 may comprise one or more tangible (non-transitory) computer readable storage media (e.g., a memory device) encoded with software comprising computer executable instructions, and when the software is executed (by the processor 810), it is operable to perform the operations described herein.
- FIG. 9 is a flowchart 900 of a generalized method in accordance with examples presented herein.
- a collaboration application presents a map of a geographic area that includes a network connected device, wherein the map includes a graphical representation of the network connected device, the graphical representation having a location on the map that corresponds to a current physical location of the network connected device in the geographic area.
- the collaboration application receives an indication of a user selection of the graphical representation.
- the collaboration application determines that the user selection of the graphical representation corresponds to a user selection of the network connected device.
- the collaboration application sends, to the network connected device, an invitation to join a collaboration meeting as a peer member of the collaboration meeting.
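The four steps above can be sketched as follows. The Device class, present_map, and send_invitation names are hypothetical placeholders; in a real system the invitation would travel over a protocol such as CoAP to the device's address rather than be returned as a string.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class Device:
    uri: str                          # address to which an invitation is sent
    map_coordinates: Tuple[int, int]  # location of its graphical representation

def present_map(devices: List[Device]) -> Dict[Tuple[int, int], Device]:
    """Build the map: each graphical representation sits at the coordinates
    that correspond to the device's current physical location."""
    return {d.map_coordinates: d for d in devices}

def send_invitation(device: Device, meeting_id: str) -> str:
    # Placeholder for the real transport (e.g., a CoAP request to the URI).
    return f"invite {device.uri} to {meeting_id} as peer"

def handle_selection(map_view: Dict[Tuple[int, int], Device],
                     clicked: Tuple[int, int], meeting_id: str) -> str:
    # Determine that selecting the graphical representation corresponds
    # to selecting the network connected device, then send the invitation.
    device = map_view[clicked]
    return send_invitation(device, meeting_id)

devices = [Device("SJ/9/OA3/L22", (4, 7)), Device("SJ/9/OA3/C5", (9, 2))]
map_view = present_map(devices)
print(handle_selection(map_view, (4, 7), "meeting-42"))
# → invite SJ/9/OA3/L22 to meeting-42 as peer
```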
- a collaboration application may invite physical network connected devices to a collaboration meeting on a per-meeting basis. This enables the creation of an accurate communication path between the collaboration application and the physical network connected devices.
- Physical network connected devices may be dynamically added to or removed from a meeting instance across geographical locations, whether connected directly or logically (e.g., over the Internet). Different groups/sub-groups of network connected devices may be created for a meeting depending on the specific need. These techniques may enhance existing collaboration tools, and may provide ease of customer use for network connected devices (e.g., connected lighting, sensors, etc.) during a conference. This provides a useful and convenient interface for communicating with network connected devices.
- the collaboration application described herein may create a suitable ambiance (e.g., via connected lighting). A user is no longer required to be tethered to a network connected device during a collaboration meeting.
- the collaboration application may select multiple network connected devices without knowing the identities of the network connected devices. For example, in a security team meeting, the participants may assess any situation in a particular area despite not knowing the individual identities (e.g., IP address, etc.) of security cameras. Thus, the collaboration application does not require prior knowledge of the identity of the network connected device to be dynamically added to the collaboration session.
- the physical location of network connected devices may be an important aspect of the collaboration meeting.
- Logically mapping network connected devices by, e.g., map coordinate pixels, IP address, physical location, authorization, etc. allows network connected devices to be dynamically added based purely on physical location. This eliminates the need for any new hardware controller or new wiring connections. For example, by selecting smart ceiling lights in different physical locations, the collaboration application may bring those lights under the direct control of a user/participant in the collaboration meeting. Whereas most conventional systems involve a centralized controller that controls only local devices, the collaboration application connects network connected devices that are located at different locations and that are not necessarily connected directly to a local central controller device.
- the mapping database illustrates how an individual network connected device may join a meeting session based on authorization.
- a participant can privately or publicly connect to a particular device and provide additional input during the meeting.
- participants may monitor their individually owned drones and provide additional input or make the streaming available to other participants.
- Another use case involves monitoring or controlling devices at home (e.g., personal network connected devices) from within the meeting interface. This allows a user to, for example, respond to an emergency situation from within the meeting.
- the mapping system provides a logical way of forming groups of similar or different devices. This may be achieved via the mapping database, e.g., if physical location is important.
- the network connected devices may be classified and bundled together dynamically based on other factors, such as a well-defined threshold (e.g., all sensors above a certain temperature threshold, connected cars travelling over a specified speed limit, etc.) or an event.
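Threshold-based bundling of this kind amounts to filtering the current device set with a predicate. A minimal sketch, with a hypothetical Sensor type standing in for any network connected device:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Sensor:
    name: str
    temperature_c: float

def dynamic_group(devices: List[Sensor],
                  predicate: Callable[[Sensor], bool]) -> List[str]:
    """Bundle the devices that currently satisfy a well-defined threshold."""
    return [d.name for d in devices if predicate(d)]

sensors = [Sensor("s1", 71.0), Sensor("s2", 40.5), Sensor("s3", 90.2)]
# Group all sensors above a 70 °C threshold for addition to the meeting.
hot = dynamic_group(sensors, lambda s: s.temperature_c > 70.0)
print(hot)  # → ['s1', 's3']
```

Because the predicate is evaluated against live values, the group's membership can change from one evaluation to the next, which matches the dynamic classification described above.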
- the collaboration application controls network connected devices in the context of a meeting instance rather than in the context of a particular room.
- the graphical user interface described herein is software based and accessible to every participant via a network connected device (e.g., laptops, mobile phones, etc.).
- the collaboration application may populate different network connected devices based on which network connected devices the participants are authorized to access, rather than a fixed set of devices tied to a central controller.
- the collaboration application may also provide access to the personal network connected devices during the meeting. In addition, these techniques are consistent with certain advertisement schemes.
- the collaboration application logically maps physical network connected devices (e.g., devices close to or in use by a user). This enables, for example, mapping sensors to an ongoing meeting and monitoring real time values to assess a situation in the field.
- the collaboration application may also provide the capability to dynamically add or remove, from a collaboration meeting, network connected devices that are not necessarily connected directly to the collaboration application.
- the physical devices may be associated with virtual collaboration instances. Logical sub-groups of devices may be dynamically created for different collaboration instances.
- the devices may be controlled or monitored using any underlying communication protocol.
- the collaboration application may have complete knowledge of smart ceiling lights that are joined to a particular meeting instance, which can be controlled from a meeting tool interface.
- a user interface is also provided for on-boarding and selecting devices dynamically.
- Various mappings are possible to achieve different results. For example, any of the following features may be mapped to each other (e.g., in a mapping database): the network connected device; the user/owner; the physical location; the network interface; the meeting instance; etc. Different combinations may be mapped/utilized in a meeting to achieve different results.
- device-to-device mapping may permit a drone to be dynamically mapped to specified display screens during a meeting so that the drone streams the video directly to the specified display screens.
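Such a device-to-device mapping can be sketched as a routing table keyed by the source device. The device names and the string-based "delivery" below are illustrative only; a real implementation would forward the stream over the devices' underlying protocol.

```python
from collections import defaultdict
from typing import DefaultDict, List

# Hypothetical device-to-device mapping table for one meeting instance:
# a source device (e.g., a drone) maps to the display screens that
# should render its stream.
stream_routes: DefaultDict[str, List[str]] = defaultdict(list)

def map_devices(source: str, sink: str) -> None:
    """Dynamically map a source device to an additional sink device."""
    stream_routes[source].append(sink)

def route_frame(source: str, frame: bytes) -> List[str]:
    """Deliver a video frame only to the displays mapped to the source."""
    return [f"{sink} <- {len(frame)} bytes" for sink in stream_routes[source]]

map_devices("drone_1", "display_room_a")
map_devices("drone_1", "display_room_b")
print(route_frame("drone_1", b"\x00" * 1024))
# → ['display_room_a <- 1024 bytes', 'display_room_b <- 1024 bytes']
```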
- a method comprises: presenting a map of a geographic area that includes a network connected device, wherein the map includes a graphical representation of the network connected device, the graphical representation having a location on the map that corresponds to a current physical location of the network connected device in the geographic area; receiving an indication of a user selection of the graphical representation; in response to receiving the indication, determining that the user selection of the graphical representation corresponds to a user selection of the network connected device; and sending, to the network connected device, an invitation to join a collaboration meeting as a peer member of the collaboration meeting.
- an apparatus comprising: a network interface configured to communicate with a network connected device; a memory; and one or more processors coupled to the memory, wherein the one or more processors are configured to: present a map of a geographic area that includes the network connected device, wherein the map includes a graphical representation of the network connected device, the graphical representation having a location on the map that corresponds to a current physical location of the network connected device in the geographic area; receive an indication of a user selection of the graphical representation; in response to receiving the indication, determine that the user selection of the graphical representation corresponds to a user selection of the network connected device; and send, to the network connected device, an invitation to join a collaboration meeting as a peer member of the collaboration meeting.
- one or more non-transitory computer readable storage media are provided.
- the non-transitory computer readable storage media are encoded with instructions that, when executed by a processor, cause the processor to: present a map of a geographic area that includes a network connected device, wherein the map includes a graphical representation of the network connected device, the graphical representation having a location on the map that corresponds to a current physical location of the network connected device in the geographic area; receive an indication of a user selection of the graphical representation; in response to receiving the indication, determine that the user selection of the graphical representation corresponds to a user selection of the network connected device; and send, to the network connected device, an invitation to join a collaboration meeting as a peer member of the collaboration meeting.
Description
- The present disclosure relates to connecting network connected devices to a collaboration meeting.
- The Internet of Things (IoT) is a general term used to describe the addition of networking capabilities to physical objects or “things” that serve some purpose or function outside of computing and/or networking technologies (i.e., traditionally “unconnected” or “offline” devices), such as thermometers, refrigerators, lights, wristbands, and sensors. In general, these “things,” sometimes referred to as IoT enabled-devices, IoT devices, or special purpose network connected devices, are embedded with electronics, software, and network interfaces, which enables the physical objects to send and/or receive data packets over a network.
- FIG. 1 is an overview diagram of a collaboration environment, according to an example embodiment.
- FIG. 2 is a graphical user interface of a collaboration application, according to an example embodiment.
- FIG. 3 is another graphical user interface of a collaboration application, according to an example embodiment.
- FIG. 4 is a mapping database, according to an example embodiment.
- FIG. 5 is a diagrammatic illustration of communications between network connected devices and a collaboration application, according to an example embodiment.
- FIG. 6 is a flowchart of a method for dynamically adding a network connected device to a collaboration meeting, according to an example embodiment.
- FIG. 7 is a flowchart of a method for mapping network connected devices, according to an example embodiment.
- FIG. 8 is a block diagram of a collaboration server configured to execute network connected device mapping techniques, according to an example embodiment.
- FIG. 9 is a flowchart of a generalized method in accordance with examples presented herein.
- In one example, a collaboration application presents a map of a geographic area that includes a network connected device. The map includes a graphical representation of the network connected device. The graphical representation has a location on the map that corresponds to a current physical location of the network connected device in the geographic area. The collaboration application receives an indication of a user selection of the graphical representation, and, in response to receiving the indication, determines that the user selection of the graphical representation corresponds to a user selection of the network connected device. The collaboration application sends, to the network connected device, an invitation to join a collaboration meeting as a peer member of the collaboration meeting.
- Conventional collaboration applications may send, to human users/participants, invitations (e.g., over email) to join a collaboration meeting. These invitations may be sent via user devices (e.g., laptops, mobile phones, etc.). However, conventional collaboration applications generally cannot send invitations to network connected devices to join the collaboration meeting as peer participants of a collaboration meeting. Instead, conventionally, such devices may be included in a collaboration meeting by having a human act as a proxy for the devices.
- Techniques described herein enable a collaboration application to invite a network connected device to join a collaboration meeting as a peer to a human participant. A collaboration application may dynamically add or remove peer network connected devices based on the current physical locations of the network connected devices. As will become apparent based on the following description, permitting a network connected device to join as a peer participant based on its current physical location may enhance participant experience during the collaboration meeting.
- With reference made to FIG. 1, a collaboration environment 100 is shown in accordance with examples presented herein. The collaboration environment 100 includes an administrator 105, a collaboration application 110, and a plurality of peer meeting participants 115. The collaboration application 110 hosts a collaboration meeting in which the peer meeting participants 115 are participating. In this example, the plurality of peer meeting participants 115 includes a drone 115(1), a camera 115(2), a sensor 115(3), a lightbulb 115(4), a display screen 115(5), and a human 115(6). Peer meeting participants 115(1)-115(5) are network connected devices. The human 115(6) may be participating via another network connected device (e.g., a laptop).
- The collaboration application 110 may dynamically add or remove one or more of the peer meeting participants 115 automatically or based on user input from the administrator 105. Administrator 105 may be human participant 115(6). Further, as mentioned, the collaboration application 110 may dynamically add or remove one or more of the peer meeting participants 115 based on the respective current geographic locations of the peer meeting participants 115. As explained in greater detail below, the collaboration application 110 includes mapping logic 120 to perform the techniques described herein.
- The mapping logic 120 enables one or more of the peer meeting participants 115 to connect to a collaboration meeting dynamically (e.g., without associating with a static entity or having to pass through a proxy). For example, the collaboration application 110 may send invitations to peer meeting participants 115(4)-115(6) during the scheduling, or at the start, of a collaboration meeting. The collaboration application 110 may subsequently send invitations to peer meeting participants 115(1)-115(3) after the collaboration meeting starts. In a further example, the collaboration application 110 may automatically send an invitation to a drone upon determining the drone has entered a geographic area (e.g., a conference room), and may remove the drone from the collaboration meeting upon determining the drone has exited the geographic area. Thus, in an example, the mapping logic 120 permits the collaboration application 110 to dynamically add certain network connected devices (e.g., network connected devices managed or in use by a participant, network connected devices in a certain geographical area, etc.).
- With reference to
FIG. 2, shown is a graphical user interface 200 of a collaboration application in accordance with examples presented herein. A collaboration application may permit one or more human users and/or network connected devices to communicate via a virtual meeting session. The graphical user interface 200 presents a map of an indoor geographic area. The map includes two conference rooms 205(1), 205(2), an open area 210, a laboratory 215, and another conference room 220. The map also includes graphical representations of network connected devices. The graphical representations have locations on the map that correspond to current physical locations of the network connected devices in the geographic area.
- FIG. 2 provides three example collaboration meetings: a first collaboration meeting represented by arrow 225(1), a second collaboration meeting represented by arrow 225(2), and a third collaboration meeting represented by arrow 225(3). As will be explained below, the first and second collaboration meetings 225(1), 225(2) illustrate region selection, and the third collaboration meeting 225(3) illustrates clickable selection.
- The first collaboration meeting 225(1) includes the network connected devices located in conference rooms 205(1), 205(2). A user may establish the first collaboration meeting 225(1) by selecting (e.g., dragging a cursor over) the region spanning the graphical representations of conference rooms 205(1), 205(2). In response, the collaboration application determines that the user selection of the graphical representations of the network connected devices located in the graphical representations of conference rooms 205(1), 205(2) corresponds to a user selection of the network connected devices located in conference rooms 205(1), 205(2). The collaboration application sends, to the network connected devices, an invitation to join the first collaboration meeting 225(1) as peer members. Thus, the network connected devices in conference rooms 205(1), 205(2) (e.g., telepresence units, microphones, etc.) dynamically join the first collaboration meeting 225(1). This allows conference rooms 205(1), 205(2) to be utilized for a single collaboration meeting.
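Region selection of this kind reduces to a lookup against the stored map coordinates. A minimal sketch, assuming a hypothetical mapping-database entry type and treating the dragged region as an axis-aligned rectangle:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Entry:
    device: str
    map_coordinates: Tuple[int, int]

def select_region(db: List[Entry], top_left: Tuple[int, int],
                  bottom_right: Tuple[int, int]) -> List[str]:
    """Return the devices whose graphical representations fall inside the
    dragged rectangle, so invitations can be sent to each of them."""
    (x1, y1), (x2, y2) = top_left, bottom_right
    return [e.device for e in db
            if x1 <= e.map_coordinates[0] <= x2
            and y1 <= e.map_coordinates[1] <= y2]

db = [Entry("telepresence_room1", (2, 3)),
      Entry("microphone_room2", (5, 4)),
      Entry("printer_hallway", (9, 9))]
print(select_region(db, (0, 0), (6, 6)))
# → ['telepresence_room1', 'microphone_room2']
```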
- The second collaboration meeting 225(2) includes network connected devices located in select, physically separate regions of open area 210. In this example, a celebration is occurring in open area 210. A user (e.g., an administrator, meeting participant, etc.) who is authorized to handle the network connected devices in the open area 210 may establish the second collaboration meeting 225(2) by selecting (e.g., dragging a cursor over) the select regions of the graphical representation of the open area 210. Optionally, the user may first select "connected lighting" in a drop-down menu (not shown) in order to filter, from the map, network connected devices that are not connected lighting devices. The user selects (e.g., by dragging a cursor over and/or clicking) the regions spanning the graphical representation of the open area.
- The collaboration application determines that the user selection of the graphical representation of the network connected devices (e.g., connected lighting) located in the select regions of the graphical representation of open area 210 corresponds to a user selection of the network connected devices located in the select regions of the open area 210. In response, the collaboration application sends, to the network connected devices, an invitation to join the second collaboration meeting 225(2) as peer members. Thus, the network connected devices in the select regions of the open area 210 (e.g., connected lighting) dynamically join the second collaboration meeting 225(2). This allows the connected lighting to be utilized for the celebration. For instance, the user may select a predefined connected lighting setting (e.g., "birthday") to cause the connected lighting to provide a celebratory ambience.
- A user may wish to cause a specific, individually owned network device (e.g., a drone, router, etc.) to join a collaboration meeting for debugging purposes. For example, in the third collaboration meeting 225(3), a user wishes for a switch located in laboratory 215 to join the third collaboration meeting 225(3). The drop-down menu 230 may only display devices/categories of devices that the user is authorized to access. Optionally, the user may first select "Lab 'A' Switches" in drop-down menu 230 in order to filter, from the map, network connected devices that are not switches located in Lab 'A' (e.g., Lab 'A' Servers). The third collaboration meeting 225(3) occurs at least partly in conference room 220. The user may cause the switch to join the third collaboration meeting 225(3) by selecting (e.g., clicking) the graphical representation of the switch. The collaboration application determines that the user selection of the graphical representation of the switch corresponds to a user selection of the switch. In response, the collaboration application sends, to the switch, an invitation to join the third collaboration meeting 225(3) as a peer member. Thus, the switch dynamically joins the third collaboration meeting 225(3). This allows the user to, for example, address debugging issues in collaboration with others.
- Each network connected device displayed on the map may have an associated Uniform Resource Identifier (URI). For example, a light (light "22") in an open area (open area "3") in a building (building "9") in a geographic location ("SJ") may have the URI SJ/9/OA3/L22. The URI SJ/9/OA3 may correspond to all devices (e.g., including light 22) in open area 3. Open area 3 may be, for example, open area 210 as illustrated in FIG. 2. Such URIs are addresses to which meeting invitations may be sent.
- With reference to
FIG. 3, shown is another graphical user interface 300 of a collaboration application in accordance with examples presented herein. The graphical user interface 300 presents a map of an outdoor geographic area. The map is a graphical representation of a campus that includes security cameras. The map includes graphical representations 305(1)-305(5) of the cameras. The locations of the graphical representations 305(1)-305(5) of the security cameras on the map correspond to the physical locations of the cameras on the campus.
- FIG. 3 provides an example collaboration meeting involving a campus security team. During the course of a meeting, a user (e.g., a member of the campus security team) may wish to view security footage from the cameras corresponding to graphical representations 305(2)-305(4) because they are located in a specific region of interest 310. Optionally, a drop-down menu 315 may present, to the user, categories of network connected devices. The category "Cameras" includes the cameras in the region of interest 310, for example. The user may select "Cameras" from the drop-down menu 315. This causes the collaboration application to update the map to display the graphical representations 305(1)-305(5). For example, selecting "Cameras" may cause the map to begin displaying graphical representations of security cameras, or to cease displaying graphical representations that are not graphical representations of security cameras. The map may update to display only the graphical representations 305(1)-305(5) that the user is authorized to select.
- Once the user has selected the "Cameras" category from the drop-down menu 315, the user may select the region 310 by, for example, dragging a cursor 320 over the region 310. The collaboration application determines that the user selection of the graphical representations 305(2)-305(4) corresponds to a user selection of the security cameras located in the corresponding region of the campus. In response, the collaboration application sends, to the cameras corresponding to graphical representations 305(2)-305(4), an invitation to join the collaboration meeting as peer members. This allows the user to, for example, view security footage from the selected cameras. It will be understood that similar use cases may exist for selecting drones, connected vehicles, sensors, etc. For example, the collaboration application may allow for real-time sensor monitoring to assess a situation at a site/location during an emergency collaboration meeting. In this example, sensors may be dynamically added or removed to assess different sites.
- The user may alternatively select the region of interest 310 before engaging the drop-down menu 315. In this example, the drop-down menu 315 may list only categories of devices that are currently located in the region of interest 310. For example, if there are no drones currently located in the region of interest 310, the drop-down menu 315 may not display "drones" as a category. Thus, the drop-down menu 315 may dynamically update based on the current geographical locations of the network connected devices in the region of interest 310. The drop-down menu 315 may only display devices/categories of devices that a user is authorized to access.
- As previously mentioned, a collaboration application (e.g., as described in connection with
FIGS. 1-3) may determine that a user selection of a graphical representation corresponds to a user selection of a network connected device. To make this determination, a collaboration application may maintain a mapping database that correlates the physical location and the graphical representation of the network connected device. An example mapping database 400 is shown in FIG. 4. The mapping database 400 includes a device field 405, an authorization field 410, a physical location field 415, a map coordinates field 420, a hostname field 425, an Internet Protocol (IP) address field 430, an interface field 435, and a protocol field 440.
- The device field 405 provides information relating to the type of network connected device. For example, "camera_parking" in Row A of the mapping database 400 may refer to a security camera in the region of interest in FIG. 3. The authorization field 410 provides information relating to the human(s) who are authorized to configure/use/invite the corresponding network connected device. For example, the authorization field 410 in Row A may indicate that any member of the security team for building 23 is authorized to send an invitation to the camera.
- The physical location field 415 indicates the physical location of the corresponding network connected device. For example, the physical location of the camera is parking lot 1 of building 23. The physical location field 415 may dynamically update based on the current physical locations of the network connected devices. This may involve the collaboration application determining a subsequent physical location of a network connected device in a geographic area. For instance, the physical location of a drone may change because drones are mobile network connected devices. Thus, if the mapping database 400 were to include information relating to a drone, the physical location field 415 for the drone would update/change dynamically (e.g., periodically, in response to a stimulus/message/notification, etc.). Fixed devices (e.g., the security camera corresponding to graphical representation 305(4)) may not change physical locations, and therefore the physical location field 415 for such devices may not dynamically update.
- The map coordinates field 420 indicates the location of the graphical representation of a network connected device on the map. This location corresponds to the physical location of the network connected device. In the example of Row A, the map coordinates {Xi, Yj} may correspond to parking lot 1 of building 23 as graphically represented on the map of FIG. 3. The collaboration application may consult the mapping database 400 to determine that the location of the graphical representation of the network connected device (e.g., the map coordinates field 420) is correlated with the network connected device (e.g., the device field 405). For example, if a user clicks on the graphical representation (or drags over an area that includes the graphical representation), the collaboration application may determine that the user selection of the graphical representation corresponds to a user selection of the network connected device. Thus, the collaboration application may query the mapping database 400 for fixed devices (e.g., pre-built into the mapping database 400) and/or mobile devices.
- Like the physical location field 415, the map coordinates field 420 may dynamically update. In one example, the collaboration application dynamically updates the mapping database 400 to correlate a network connected device with a subsequent physical location of the network connected device and a subsequent location of the graphical representation that corresponds to the subsequent physical location. For instance, if the physical location field 415 for a drone changes/updates, the map coordinates field 420 for the drone may dynamically update accordingly. The collaboration application may dynamically update the map based on the mapping database 400. This may permit the user to dynamically add or remove network connected devices from the collaboration meeting even if the network connected devices change location.
- The hostname field 425 indicates the host name of the corresponding network connected device. As illustrated by Row A, certain network connected devices may not have a specified host name. The IP address field 430 indicates the IP address of the corresponding network connected device. For instance, the IP address of the camera is 10.10.10.1. The interface field 435 indicates the type of interface associated with the corresponding network connected device. Like the hostname field 425, certain network connected devices may not have a specified type of interface. The protocol field 440 indicates the particular protocol(s) by which the collaboration application communicates with the corresponding network connected device. In this example, the security camera uses the Constrained Application Protocol (CoAP) as described in Internet Engineering Task Force (IETF) Request for Comments (RFC) 7252.
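The dynamic coordinate update described above can be sketched as a single database operation. The Row type and the to_map_coordinates conversion below are hypothetical; a real system would project actual position fixes (e.g., GPS) onto the map's pixel grid.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

def to_map_coordinates(lat: float, lon: float) -> Tuple[int, int]:
    # Hypothetical projection from a reported physical position to the
    # pixel coordinates of the graphical representation on the map.
    return (int(lat * 10) % 100, int(lon * 10) % 100)

@dataclass
class Row:
    device: str
    physical_location: Tuple[float, float]
    map_coordinates: Tuple[int, int]

def update_location(db: Dict[str, Row], device: str,
                    lat: float, lon: float) -> Row:
    """Correlate the device with its subsequent physical location and the
    subsequent location of its graphical representation."""
    row = db[device]
    row.physical_location = (lat, lon)
    row.map_coordinates = to_map_coordinates(lat, lon)
    return row

db = {"drone_1": Row("drone_1", (37.3, -121.9),
                     to_map_coordinates(37.3, -121.9))}
updated = update_location(db, "drone_1", 37.4, -121.8)
print(updated.map_coordinates)
```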
FIG. 5 is adiagrammatic illustration 500 of example communications between network connected devices 505(1)-505(3) andcollaboration application 110. In this example, thecollaboration application 110 communicates with the network connected devices 505(1)-505(3) via CoAP (e.g., CoAP observe). For example, the network connected devices 505(1)-505(3) may register with thecollaboration application 110 using CoAP before the start of the collaboration meeting. During meeting scheduling, the host/user may register network connected devices 505(1)-505(3), which in this example are required for the collaboration meeting. Thus, thecollaboration application 110 may determine the current physical location of a network connected device in a geographic area (e.g., via CoAP) before the meeting begins and construct themapping database 400 to include thephysical location field 415. Thecollaboration application 110 may present the map to a user based on themapping database 400. The map permits a user to, for example, invite specified network connected devices to a scheduled or ongoing collaboration meeting. The user may invite network connected devices based on geography and/or association (e.g., mobile phones, user-owned network connected devices, etc.). In one example, thecollaboration application 110 may send commands/notifications (e.g., commands with predefined instructions) to network connected devices 505(1)-505(3). The meeting participants may receive (e.g., via email) a control mechanism to override any such command or apply a new/different command. - When the collaboration meeting begins, the
collaboration application 110 may send a CoAP command/notification to network connected devices 505(1)-505(3) (e.g., telepresence units, connected lighting, sensors, etc.). The command/notification may prompt network connected devices 505(1)-505(3) to join the meeting automatically and perform predefined actions. For example, a security camera or drone may begin streaming video to the collaboration application when the collaboration meeting begins. - After the collaboration meeting begins, the
collaboration application 110 may enable manual or automatic dynamic addition or removal of one or more network connected devices to/from the collaboration meeting. A user may invite/control, e.g., connected lighting, thermostats, cameras, etc. For example, connected lighting may be invited to a collaboration meeting and set to a festive setting during a celebration. In addition, sub-groups of network connected devices may be specified. For example, a sub-group may include a drone and a particular display in a meeting room. In this example, the drone may dynamically stream video solely to the particular display, while other displays in the meeting room may be used for other purposes (e.g., a presentation of the meeting). In another example, data from multiple sensors in a sub-group may be presented side-by-side in real-time during a collaboration meeting. - The above example involving sensors enables the dynamic grouping of sensors based on physical location and the dynamic creation of different groups to achieve different results. These sensors, or groups of sensors, may be added to a collaboration meeting. Dynamic grouping may be based on, for example, site location, threshold bundles, faulty sensor bundles, or any other well-defined parameters. In a further example, the collaboration application maps the sensors as well as other network connected devices (e.g., devices on the Internet), so that two groups may be dynamically added during a meeting. For example, one group may bundle temperature sensors at a particular site and another group may include the heating, ventilation, and air conditioning (HVAC) system on the same site. With both groups joining the ongoing meeting, it is possible to assess the situation at the site and take corresponding action in parallel. Based on the sensor group monitoring the HVAC system group, the HVAC systems may be controlled to achieve a desired result.
Feedback on this result may be provided immediately by the sensor bundle. It will be appreciated that sensor monitoring is just one example use case.
-
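The two-group HVAC scenario above can be sketched as a minimal feedback rule: a temperature-sensor bundle and an HVAC group at the same site both join the meeting, and each HVAC unit is adjusted based on what the sensor group reports. All names, readings, and the control rule here are illustrative assumptions, not details from the patent:

```python
# Map each zone's sensor reading to a simple command for its HVAC unit.
# A real system would receive these readings in real time during the
# ongoing meeting and apply a richer control policy.
def hvac_adjustments(sensor_readings, setpoint):
    commands = {}
    for zone, temp in sensor_readings.items():
        if temp > setpoint:
            commands[zone] = "cool"
        elif temp < setpoint:
            commands[zone] = "heat"
        else:
            commands[zone] = "hold"
    return commands

sensor_group = {"zone-1": 75.0, "zone-2": 68.0, "zone-3": 72.0}
print(hvac_adjustments(sensor_group, 72.0))
# {'zone-1': 'cool', 'zone-2': 'heat', 'zone-3': 'hold'}
```

The sensor bundle then provides immediate feedback on the result, closing the loop described above.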
FIG. 6 is a flowchart 600 of an example method for dynamically adding a network connected device to a collaboration meeting. At 605, the collaboration meeting starts. At 610, it is determined whether a network connected device is to be dynamically added to the collaboration meeting. If not, the collaboration meeting proceeds without sending an invitation to the network connected device until the meeting has concluded at 615. If the network connected device is to be dynamically added to the collaboration meeting, the flow proceeds to 620. - At 620, it is determined whether the network connected device is user owned. If the network connected device is not user owned, at 625 the user may select a device type (e.g., from a drop-down menu) and area/region on the map. At 630, the collaboration application selects the network connected device (e.g., using a mapping database). If the network connected device is user owned, at 635 the network connected device may be manually or automatically selected from a list of authorized devices. The flow may proceed to 630 where, as mentioned, the collaboration application selects the network connected device/registers the network connected device as selected. At 640, the collaboration application dynamically adds (e.g., sends an invitation to) a network connected device to the collaboration meeting. The flow proceeds to 615, where the collaboration meeting stops.
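The decision flow of FIG. 6 (steps 610 through 640) might be sketched as follows; the class and function names are illustrative and not drawn from the patent:

```python
# A hedged sketch of the FIG. 6 decision flow.
class Meeting:
    def __init__(self):
        self.members = []

    def invite(self, device):
        self.members.append(device)  # step 640: dynamically add the device

def lookup(mapping_db, device_type, map_region):
    # Steps 625-630: resolve a device from a type plus map area/region.
    for dev, (dtype, region) in mapping_db.items():
        if dtype == device_type and region == map_region:
            return dev
    return None

def add_device(meeting, mapping_db, user_owned=False, device=None,
               authorized=(), device_type=None, map_region=None):
    if user_owned:
        # Step 635: select from the user's list of authorized devices.
        selected = device if device in authorized else None
    else:
        selected = lookup(mapping_db, device_type, map_region)
    if selected is not None:
        meeting.invite(selected)  # step 640: send the invitation
    return selected

mapping_db = {"camera-1": ("camera", "floor2-east")}
meeting = Meeting()
print(add_device(meeting, mapping_db, device_type="camera",
                 map_region="floor2-east"))  # camera-1
```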
- While the method of
FIG. 6 is dynamic (i.e., applied during an ongoing meeting), it will be understood that similar techniques may apply before the collaboration meeting begins (e.g., during collaboration meeting scheduling). For example, a collaboration application may, at the start of the collaboration meeting, send invitations to network connected devices that a user registered/selected before the collaboration meeting started. -
FIG. 7 is a flowchart 700 of an example method for mapping network connected devices. At 705, the collaboration application acquires information (e.g., location information, authorization information, IP address, etc.) regarding the network connected devices. The collaboration application may communicate with the network connected devices using, for example, CoAP, H.325, Ethernet protocols, protocols for cloud based applications, etc. The collaboration application may also determine whether the device is to be added to or removed from a collaboration meeting. - At 710, the collaboration application builds a mapping database based on the acquired information. The mapping database (e.g., mapping database 400) may include information about a well-defined area on a building floor that is mapped to the appropriate mapping coordinates. For example, a conference room may have a range in the map from {Xi, Yi} to {Xn, Yn}. The mapping table may include map coordinates {Xk, Yk} for a graphical representation of a network connected device that correspond to the physical location of the network connected device in the conference room. The mapping database may include manually input information (e.g., for certain fixed devices) and/or automatically discovered information (e.g., for certain mobile devices). For example, the mapping database may include manually input information for a security camera, as well as information automatically obtained from a connected vehicle (e.g., via a global positioning system).
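The coordinate scheme described above reduces to a simple containment check: a room occupies a rectangular range on the map, and a device's map coordinates {Xk, Yk} fall within the range {Xi, Yi} to {Xn, Yn} of the room that physically contains it. This is a minimal sketch under that assumption, with illustrative coordinate values:

```python
# Return True when a device's map coordinates fall within a room's
# rectangular range on the map.
def in_room(room_range, device_xy):
    (x_i, y_i), (x_n, y_n) = room_range
    x_k, y_k = device_xy
    return x_i <= x_k <= x_n and y_i <= y_k <= y_n

conf_room_1 = ((10, 20), (50, 60))  # {Xi, Yi} to {Xn, Yn}
print(in_room(conf_room_1, (30, 40)))  # True: icon drawn inside the room
print(in_room(conf_room_1, (70, 40)))  # False: outside the room's range
```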
- At 715, the collaboration application detects a new network connected device. For example, a mobile device, such as a drone, may have recently entered a region of interest. The collaboration application may tag the network connected device as unidentified and display a corresponding graphical representation on the map. The collaboration application may also/alternatively send a notification to a user/administrator about the new network connected device.
- At 720, it is determined whether the protocol (e.g., CoAP) used to communicate with the device provides the desired information (e.g., information to populate the mapping database). If the protocol is sufficiently intelligent to permit communication of the desired information (e.g., network connected device location and authorization information), the collaboration application may automatically update the mapping database (e.g., with updated map coordinates, physical location, etc.) at 725. External protocols may provide other information, such as IP address, interface, port, and other desired information. The mapping database may also maintain this other information.
- If the protocol does not provide the desired information, the flow proceeds to 730, where it is determined whether precise mapping is required. If precise mapping is not required, the user/administrator may manually provide general location information and authorization details at 735. For example, the user may manually input Conf_Room_1 as the physical location and all_employees as the authorization information. The collaboration application may automatically allocate the next available map coordinates from the predefined range corresponding to conference
room 1. Thus, the mapping database may be updated to include the desired information. External protocols may provide other information, such as IP address, interface, port, and other desired information. The mapping database may also maintain this other information. If precise mapping is required, the user/administrator may manually provide precise location information and authorization details at 740. For example, the user may manually provide a precise location and input admin_to_vp as the authorization information. -
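The automatic coordinate allocation at 735 might look like the following sketch, which scans a room's predefined range for the next free coordinate. The scan order and function names are assumptions of this illustration:

```python
# Allocate the next available map coordinate from a room's predefined
# rectangular range, skipping coordinates already in use.
def allocate_coordinates(room_range, occupied):
    (x_i, y_i), (x_n, y_n) = room_range
    for y in range(y_i, y_n + 1):
        for x in range(x_i, x_n + 1):
            if (x, y) not in occupied:
                occupied.add((x, y))
                return (x, y)
    return None  # the room's range is exhausted

occupied = {(10, 20)}  # one coordinate already assigned
print(allocate_coordinates(((10, 20), (12, 21)), occupied))  # (11, 20)
```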
FIG. 8 is a block diagram of a collaboration server 800 that is configured to implement techniques presented herein. In this example, the collaboration server 800 includes a memory 805, one or more processors 810, and a network interface 815. The memory 805 includes mapping logic 820 (e.g., mapping logic 120 of FIG. 1) and a database 825 (e.g., mapping database 400 of FIG. 4). The one or more processors 810 are configured to execute instructions stored in the memory 805 (e.g., mapping logic 820). When executed by the one or more processors 810, the mapping logic 820 enables the collaboration server 800 to perform the operations associated with techniques described herein. - The
memory 805 may be read only memory (ROM), random access memory (RAM), magnetic disk storage media devices, optical storage media devices, flash memory devices, or electrical, optical, or other physical/tangible memory storage devices. Thus, in general, the memory 805 may comprise one or more tangible (non-transitory) computer readable storage media (e.g., a memory device) encoded with software comprising computer executable instructions that, when executed (by the processor 810), are operable to perform the operations described herein. -
FIG. 9 is a flowchart 900 of a generalized method in accordance with examples presented herein. At 910, a collaboration application presents a map of a geographic area that includes a network connected device, wherein the map includes a graphical representation of the network connected device, the graphical representation having a location on the map that corresponds to a current physical location of the network connected device in the geographic area. At 920, the collaboration application receives an indication of a user selection of the graphical representation. At 930, in response to receiving the indication, the collaboration application determines that the user selection of the graphical representation corresponds to a user selection of the network connected device. At 940, the collaboration application sends, to the network connected device, an invitation to join a collaboration meeting as a peer member of the collaboration meeting. - A collaboration application may invite physical network connected devices to a collaboration meeting on a per-meeting basis. This enables the creation of an accurate communication path between the collaboration application and the physical network connected devices. Physical network connected devices may be dynamically added or removed from a meeting instance across geographical locations, connected directly or logically (e.g., over the Internet). Different groups/sub-groups of network connected devices may be created for a meeting depending on the specific need. These techniques may enhance existing collaboration tools, and may provide ease of customer use for network connected devices (e.g., connected lighting, sensors, etc.) during a conference. This is a useful and convenient interface for communicating with network connected devices. During fun events, the collaboration application described herein may create a suitable ambiance (e.g., via connected lighting).
A user is no longer required to be tethered to a network connected device during a collaboration meeting.
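The generalized method of FIG. 9 can be sketched as a map-click handler: the selected graphical representation is resolved to its network connected device (steps 920-930), which is then invited to the meeting (step 940). A real interface would hit-test icons near the click rather than require exact coordinates; the structure names below are illustrative:

```python
invitations = []  # devices that have been sent a peer-member invitation

def on_map_click(click_xy, icons):
    # icons: map coordinates of each graphical representation -> device.
    # Resolving the clicked icon to a device covers steps 920-930.
    device = icons.get(click_xy)
    if device is not None:
        invitations.append(device)  # step 940: invite as a peer member
    return device

icons = {(30, 40): "camera-1", (31, 45): "light-7"}
print(on_map_click((30, 40), icons))  # camera-1
```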
- In addition to dynamically adding individual network connected devices, the collaboration application may select multiple network connected devices without knowing the identities of the network connected devices. For example, in a security team meeting, the participants may assess any situation in a particular area despite not knowing the individual identities (e.g., IP address, etc.) of security cameras. Thus, the collaboration application does not require prior knowledge of the identity of the network connected device to be dynamically added to the collaboration session.
- The physical location of network connected devices may be an important aspect of the collaboration meeting. Logically mapping network connected devices by, e.g., map coordinate pixels, IP address, physical location, authorization, etc. allows network connected devices to be dynamically added based purely on physical location. This eliminates the need for any new hardware controller or new wiring connections. For example, by selecting smart ceiling lights in different physical locations, the collaboration application may bring those lights under the direct control of a user/participant in the collaboration meeting. Whereas most conventional systems involve a centralized controller that controls only local devices, the collaboration application connects network connected devices that are located at different locations and that are not necessarily connected directly to the local central controller device.
- In addition, current systems cannot add devices to a meeting on an ownership basis. The techniques described herein allow for access based on “authorization”. The mapping database illustrates how an individual network connected device may join a meeting session based on authorization. During a virtual collaboration session, a participant can privately or publicly connect to a particular device and provide additional input during the meeting. For example, during a virtual meeting, participants may monitor their individually owned drones and provide additional input or make the streaming available to other participants. Another use case involves monitoring or controlling devices at home (e.g., personal network connected devices) from within the meeting interface. This allows a user to, for example, respond to an emergency situation from within the meeting.
- Also described is the dynamic grouping/sub-grouping of network connected devices based on a requirement during a collaboration session. The group/sub-group may be made part of an ongoing session. Different groups of devices may be formed based on physical location, authorized access, personal network connected devices, etc. to achieve different results by dynamically adding the network connected devices while the collaboration session is active. The mapping system provides a logical way of forming groups of similar or different devices. This may be achieved via the mapping database, e.g., if physical location is important. Alternatively, the network connected devices may be classified and bundled together dynamically based on other factors, such as a well-defined threshold (e.g., all sensors above a certain temperature threshold, connected cars travelling over a specified speed limit, etc.) or an event.
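A threshold-based dynamic group of the kind described above might be formed as follows; the sensor names, readings, and threshold are illustrative:

```python
# Bundle all sensors whose current reading exceeds a well-defined
# threshold into a group that can then be added to the session.
def threshold_group(readings, threshold):
    return sorted(s for s, value in readings.items() if value > threshold)

readings = {"temp-1": 71.5, "temp-2": 68.0, "temp-3": 75.2}
print(threshold_group(readings, 70.0))  # ['temp-1', 'temp-3']
```

The same shape of query could bundle, e.g., connected cars travelling over a specified speed limit.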
- The collaboration application controls network connected devices in the context of a meeting instance rather than in the context of a particular room. The graphical user interface described herein is software based and accessible to every participant via a network connected device (e.g., laptops, mobile phones, etc.). In addition, the collaboration application may populate different network connected devices based on which network connected devices the participants are authorized to access, rather than a fixed set of devices tied to the central controller. The collaboration application may also provide access to the personal network connected devices during the meeting. In addition, these techniques are consistent with certain advertisement schemes.
- As mentioned, the collaboration application logically maps physical network connected devices (e.g., devices close to or in use by a user). This enables, for example, mapping sensors to an ongoing meeting and monitoring real time values to assess a situation in the field. The collaboration application may also provide the capability to dynamically add or remove network connected devices from a collaboration meeting even when those devices are not connected directly to the collaboration application. In one example, the physical devices may be associated with virtual collaboration instances. Logical sub-groups of devices may be dynamically created for different collaboration instances. The devices may be controlled or monitored using any underlying communication protocol. For example, the collaboration application may have complete knowledge of smart ceiling lights that are joined to a particular meeting instance, which can be controlled from a meeting tool interface.
- Due to the complete logical mapping and knowledge of physical network connected devices (e.g., connected smart lighting, sensors, display screen, etc.), reserved rooms, etc., many variants of logical grouping may be created dynamically. Complete knowledge of the devices permits various sub-groups of the devices to be created and controlled even if the devices are physically separated.
- A user interface is also provided for on-boarding and selecting devices dynamically. Various mappings are possible to achieve different results. For example, any of the following features may be mapped to each other (e.g., in a mapping database): the network connected device; the user/owner; the physical location; the network interface; the meeting instance; etc. Different combinations may be mapped/utilized in a meeting to achieve different results. For example, device-to-device mapping may permit a drone to be dynamically mapped to specified display screens during a meeting so that the drone streams the video directly to the specified display screens.
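The device-to-device mapping described above might be represented as a simple routing table built for one meeting instance; the device names are illustrative:

```python
# Build a routing table mapping each streaming source (e.g., a drone)
# to the display screens it was dynamically mapped to for the meeting.
def build_stream_routes(pairs):
    routes = {}
    for source, target in pairs:
        routes.setdefault(source, []).append(target)
    return routes

routes = build_stream_routes([("drone-1", "display-3"),
                              ("drone-1", "display-4")])
print(routes["drone-1"])  # ['display-3', 'display-4']
```

During the meeting, the drone's video would then be delivered only to the displays listed for it.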
- In one form, a method is provided. The method comprises: presenting a map of a geographic area that includes a network connected device, wherein the map includes a graphical representation of the network connected device, the graphical representation having a location on the map that corresponds to a current physical location of the network connected device in the geographic area; receiving an indication of a user selection of the graphical representation; in response to receiving the indication, determining that the user selection of the graphical representation corresponds to a user selection of the network connected device; and sending, to the network connected device, an invitation to join a collaboration meeting as a peer member of the collaboration meeting.
- In another form, an apparatus is provided. The apparatus comprises: a network interface configured to communicate with a network connected device; a memory; and one or more processors coupled to the memory, wherein the one or more processors are configured to: present a map of a geographic area that includes the network connected device, wherein the map includes a graphical representation of the network connected device, the graphical representation having a location on the map that corresponds to a current physical location of the network connected device in the geographic area; receive an indication of a user selection of the graphical representation; in response to receiving the indication, determine that the user selection of the graphical representation corresponds to a user selection of the network connected device; and send, to the network connected device, an invitation to join a collaboration meeting as a peer member of the collaboration meeting.
- In another form, one or more non-transitory computer readable storage media are provided. The non-transitory computer readable storage media are encoded with instructions that, when executed by a processor, cause the processor to: present a map of a geographic area that includes a network connected device, wherein the map includes a graphical representation of the network connected device, the graphical representation having a location on the map that corresponds to a current physical location of the network connected device in the geographic area; receive an indication of a user selection of the graphical representation; in response to receiving the indication, determine that the user selection of the graphical representation corresponds to a user selection of the network connected device; and send, to the network connected device, an invitation to join a collaboration meeting as a peer member of the collaboration meeting.
- The above description is intended by way of example only. Although the techniques are illustrated and described herein as embodied in one or more specific examples, it is nevertheless not intended to be limited to the details shown, since various modifications and structural changes may be made within the scope and range of equivalents of the claims.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/660,386 US20190036719A1 (en) | 2017-07-26 | 2017-07-26 | Connecting physical resources to virtual collaboration meeting |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190036719A1 true US20190036719A1 (en) | 2019-01-31 |
Family
ID=65138441
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/660,386 Abandoned US20190036719A1 (en) | 2017-07-26 | 2017-07-26 | Connecting physical resources to virtual collaboration meeting |
Country Status (1)
Country | Link |
---|---|
US (1) | US20190036719A1 (en) |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100094548A1 (en) * | 2008-07-09 | 2010-04-15 | Tadman Frank P | Methods and systems of advanced real estate searching |
US20100216491A1 (en) * | 2009-02-20 | 2010-08-26 | David Winkler | Dynamic elements on a map within a mobile device, such as elements that facilitate communication between users |
US20120166972A1 (en) * | 2006-02-24 | 2012-06-28 | Yahoo! Inc. | Method and system for communicating with multiple users via a map over the internet |
US20140108943A1 (en) * | 2012-10-16 | 2014-04-17 | Korea Electronics Technology Institute | Method for browsing internet of things and apparatus using the same |
US20140148135A1 (en) * | 2005-04-04 | 2014-05-29 | X One, Inc. | Location Sharing And Tracking Using Mobile Phones Or Other Wireless Devices |
US20140241354A1 (en) * | 2013-02-25 | 2014-08-28 | Qualcomm Incorporated | Establishing groups of internet of things (iot) devices and enabling communication among the groups of iot devices |
US20140331144A1 (en) * | 2011-11-24 | 2014-11-06 | Jae Ho Kim | Method and system for providing nui |
US20150201022A1 (en) * | 2012-07-11 | 2015-07-16 | Korea Electronics Technology Institute | Method for providing internet of things service |
US20150347114A1 (en) * | 2014-05-28 | 2015-12-03 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling internet of things devices |
US20160105292A1 (en) * | 2014-10-13 | 2016-04-14 | Korea Advanced Institute Of Science And Technology | Method and System for Controlling Internet of Things (IoT) Device |
US20170126525A1 (en) * | 2015-11-02 | 2017-05-04 | Thington, Inc. | Systems and methods for controlling devices |
US20170308268A1 (en) * | 2016-04-21 | 2017-10-26 | Alpine Electronics, Inc. | Map presentation system and navigation system |
US20170323540A1 (en) * | 2016-05-09 | 2017-11-09 | Coban Technologies, Inc. | Systems, apparatuses and methods for triggering actions based on data capture and characterization |
US20180060153A1 (en) * | 2016-08-31 | 2018-03-01 | At&T Intellectual Property I, L.P. | Sensor Web for Internet of Things Sensor Devices |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190379752A1 (en) * | 2015-10-05 | 2019-12-12 | Polycom, Inc. | System and method for collaborative telepresence amongst non-homogeneous endpoints |
US10862987B2 (en) * | 2015-10-05 | 2020-12-08 | Polycom, Inc. | System and method for collaborative telepresence amongst non-homogeneous endpoints |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: CISCO TECHNOLOGY, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SHUKLA, AMITESH;PARELLO, JOHN;SIGNING DATES FROM 20170725 TO 20170726;REEL/FRAME:043123/0446 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |