Begun at NYU in January 2013 as part of an online Patents Translation course, this blog seeks to uncover the patents that rock our daily lives.
If you like watching birds and would like a close-up look at the guests that come to your bird feeder, then the Birdbuddy is definitely the smart bird feeder for you!
Winner of the Innovation Award in the Artificial Intelligence category at the SXSW™ Conference, held in Austin, TX, March 10-19, 2023, the Birdbuddy, with its companion app, is a smart bird feeder. The Birdbuddy is smart because it not only takes beautiful, high-resolution pictures and videos of your bird guests; it is also designed to identify approximately 1,000 species that are likely to drop in and feed, and each visitor identification comes with bonus information about the species. The Birdbuddy also sends you notifications when a guest is feeding and when the feeder sensor detects that the bird seed level is low. It even comes with an optional solar roof accessory for powering the camera, which is otherwise powered via battery or USB cable.
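As a rough illustration of that detect-classify-notify flow, here is a hypothetical sketch; the toy classifier, the low-seed threshold and all function names are invented for illustration and are not Birdbuddy's actual software.

```python
# Hypothetical sketch of a smart-feeder pipeline like the one described
# above: detect a visitor, classify its species, and notify the owner.
# The toy classifier and threshold are illustrative, not Birdbuddy code.

SPECIES_FACTS = {
    "Northern Cardinal": "Males are bright red; a frequent feeder guest.",
    "Tufted Titmouse": "Known for carrying off seeds one at a time.",
}

def classify(image_features: dict) -> str:
    # Stand-in for a ~1,000-species image classifier.
    return "Northern Cardinal" if image_features.get("color") == "red" else "Tufted Titmouse"

def on_visit(image_features: dict, seed_level: float) -> list[str]:
    """Return the notifications a single feeder visit would trigger."""
    species = classify(image_features)
    notes = [f"Guest at the feeder: {species}. {SPECIES_FACTS.get(species, '')}"]
    if seed_level < 0.2:  # low-seed threshold (assumed)
        notes.append("Seed level is low - time to refill!")
    return notes

print(on_visit({"color": "red"}, seed_level=0.1))
```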
Below is one of the owner-uploaded YouTube videos, showing first and second Birdbuddy visitors.
The Birdbuddy app collects data on all Birdbuddy guests. By March 12, 2023, for example, Birdbuddy statistics had already recorded a total of 42,590,015 photos taken, 502 species spotted and 52,605 active feeders. Bird guest information is collected in a database of bird migrations and populations, then forwarded to experts to help them better understand and protect birds. The data is also visually processed on maps showing sightings in North America, in real time, for all species combined or per species, such as the Northern Cardinal or the Tufted Titmouse.
Masterminded by a bird-loving group of developers and engineers hailing from Slovenia, the Birdbuddy was launched on Kickstarter, where it is available for pre-order, as it is on the Birdbuddy website.
According to Pavegen Systems, it's not only patents on the soles of your shoes, it’s POWER! Power to light up the world, thousands of footsteps at a time, every day!
Pavegen Systems is a British company that manufactures pavement tiles designed to harvest the kinetic energy of pedestrian footsteps and/or vehicle traffic and transform it into electricity for street lamps and buildings. Technically, the invention consists in capturing kinetic energy in the form of linear motion and converting it into rotational motion suitable for driving the rotor of an electric generator.
Beyond generating about 2.1 watt-hours of energy per hour in high pedestrian traffic areas (or 20 seconds of light with each footstep), the Pavegen Systems tiles also become part of the Internet of Things (IoT), generating data for use in marketing, transportation and municipal planning. "It's knowing where people are," according to one of the inventors, Laurence Kemball-Cook, CEO and Founder of Pavegen Systems. It is also knowing what people do: tiles installed in a fitness center, for example, can connect to mobile devices and provide users with various sorts of health and fitness feedback based on their step counts.
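Taking the figures above at face value, a quick back-of-the-envelope check is possible. The per-footstep energy and lamp power below are illustrative assumptions chosen to be consistent with the quoted figures, not Pavegen's published specifications.

```python
# Back-of-the-envelope check of the "20 seconds of light per footstep"
# and "~2.1 Wh per hour" figures. The per-step energy and lamp power
# are illustrative assumptions, not Pavegen specifications.

ENERGY_PER_STEP_J = 3.0   # joules harvested per footstep (assumed)
LAMP_POWER_W = 0.15       # small LED lamp draw in watts (assumed)

seconds_of_light = ENERGY_PER_STEP_J / LAMP_POWER_W   # E = P*t  =>  t = E/P
print(f"{seconds_of_light:.0f} s of light per footstep")   # -> 20 s

steps_per_hour = 2520     # busy walkway (assumed)
watt_hours = steps_per_hour * ENERGY_PER_STEP_J / 3600    # joules -> Wh
print(f"{watt_hours:.1f} Wh harvested per hour")           # -> 2.1 Wh
```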
In the future, Pavegen Systems intends to produce tiles that pave roads and harvest the kinetic energy of vehicle traffic. This is apparently more difficult, however, given the tremendous traction forces of heavyweight traffic and the resistance required of road-paving materials.
This real and sparkling clean, green technology, designed for the 21st-century urban center, was awarded the SXSW™ 2017 Interactive Innovation Award in the Smart Cities category.
This invention was disclosed in the following patent family:
The abstract of this invention is included below, as well as a patent figure drawing of the Pavegen Systems motion converter.
The present application describes techniques for the harvesting of kinetic energy from the movement of people and/or vehicles. A motion converter is discussed which allows linear motion caused by traffic-related impulse forces to be converted to rotational motion for driving the rotor of an electricity generator. An assembly for harvesting energy, including the motion converter and a floor unit, is also described.
Have you ever painted in space... without paper.... that is, in 3D VR (Virtual Reality) "using a room as your canvas”?
If you have not, and have entertained the thought of jumping into VR with paint brushes, then you are in for a thrilling treat with Tilt Brush -- an award-winning application that most recently garnered the SXSW™ 2017 Interactive Innovation Award in the VR and AR category! Painting inside 3D VR software (wearing a 3D headset and manipulating the handheld controller palette and brush) is exactly how you paint with Tilt Brush, and you can even record and play back your art piece. Using an extraordinary array of brush effects, such as dry and wet ink, patterns, metal, lights, animated, volumetric, refraction and music-reactive strokes, you can paint, draw, sculpt, animate and re-invent the possibilities -- out of 2D Flatland paper and into 3D space!
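To make the record-and-playback idea concrete, here is a minimal sketch of how a 3D stroke might be stored and replayed; the data layout and field names are assumptions for illustration, not Tilt Brush's actual file format.

```python
# Minimal sketch of a recorded 3D paint stroke: a brush type plus a
# sequence of timestamped controller positions, replayable in time
# order. The layout is assumed for illustration only.
from dataclasses import dataclass, field

@dataclass
class StrokePoint:
    x: float          # controller position in room space (meters)
    y: float
    z: float
    pressure: float   # trigger pressure, mapped to stroke width
    t: float          # timestamp (seconds) for timed playback

@dataclass
class Stroke:
    brush: str        # e.g. "wet ink", "light", "volumetric"
    points: list[StrokePoint] = field(default_factory=list)

    def replay(self):
        """Yield points in recorded time order for playback."""
        yield from sorted(self.points, key=lambda p: p.t)

s = Stroke("light")
s.points.append(StrokePoint(0.0, 1.5, 0.2, pressure=0.8, t=0.0))
s.points.append(StrokePoint(0.1, 1.6, 0.2, pressure=0.6, t=0.033))
print([p.t for p in s.replay()])  # -> [0.0, 0.033]
```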
See the Tilt Brush launch video!
Tilt Brush was originally developed by Skillman and Hackett, a company that developed rapid prototyping and VR applications. The company was purchased in April 2016 by Google Inc. and is now part of the Google VR portfolio.
This invention is disclosed in a family of 2 patents, which use a dress form as the anchoring object and fashion design as a possible domain of application:
US20160370971 - Dress form for three-dimensional drawing inside virtual reality environment
WO2017048685 - Generation of three-dimensional fashion objects by drawing inside a virtual reality environment
The abstract of this invention is included below with the Figure 3 patent drawing.
Systems and methods are described for producing a representation of a display of a three-dimensional virtual reality environment and defining a dress form object within the virtual reality environment. The virtual reality environment is configured to receive interactive commands from at least one input device coupled to a computing device and associated with a user. Fabric movement simulations are generated by animating the dress form object according to configured animation data and displayed in the virtual reality environment. The display may be generated in response to receiving a movement pattern indicating movement of the dress form object. (Abstract, WO2017048685 and US20160370971)
Tilt Brush is available on HTC Vive and Oculus Rift VR platform headsets.
Can you imagine a camera following you in mid-air and shooting a perfect video of you bungee jumping off a bridge, moguling down a black diamond hill, river rafting in turbulent water, or hang gliding like a bird?
Ok, but that's not all!... When you have touched the ground, or are at the bottom of the mogul hill, can you then imagine the camera gently landing on the palm of your hand like a falcon?
Imagine no mo’…. This is exactly what the throw-and-shoot Lily Robotics drone camera does!
Just throw it in the air like a ball and it will start to hover and follow you around, taking videos. Then, when you are done, the camera gracefully lands on your hand, not on the ground.
WO2015179797, titled Unmanned aerial copter for photography and/or videography, discloses this invention, which essentially consists of two components: an operator device and an Unmanned Aerial Vehicle (UAV) copter.
The UAV copter is equipped with a portable power source and one or more image-capturing devices, such as video cameras. It is further equipped with sensors able to determine the elevation of both the copter and the operator device, for the purpose of maintaining a preset difference in elevation using the thrust power of the UAV copter propellers.
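A minimal sketch of that elevation-hold behavior follows, assuming a simple proportional control law; the patent describes adjusting thrust to maintain the preset elevation difference but does not specify this particular controller.

```python
# Sketch of the elevation-hold behavior described above: adjust thrust
# so the copter keeps a preset elevation offset above the operator
# device. A plain proportional controller is assumed for illustration.

def thrust_adjustment(copter_elev_m: float,
                      operator_elev_m: float,
                      preset_offset_m: float = 5.0,
                      gain: float = 0.8) -> float:
    """Positive value -> increase propeller thrust; negative -> decrease."""
    target = operator_elev_m + preset_offset_m
    error = target - copter_elev_m   # how far below the target we are
    return gain * error

print(round(thrust_adjustment(102.0, 100.0), 2))  # 3 m below target -> 2.4
print(round(thrust_adjustment(106.0, 100.0), 2))  # 1 m above target -> -0.8
```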
The UAV copter is also able to recognize the location of the operator (or subject) wearing the operator device, and to track the subject in a photo frame using its one or more cameras pointed at the operator/subject. Thus, for the purpose of recognizing and tracking the operator, the operator device is equipped with “a barometer, a magnetometer, an accelerometer, a gyroscope, a GPS module, or any combination thereof.” [0010]
Compared to the prior art of UAV video cameras, Lily resolves a number of issues. First, the operator is no longer a third person, alienated from the video scene, who manually operates the UAV via remote control. Since the Lily camera can automatically detect elevation and elevation difference, and then automatically direct its camera at the operator, independently of a photographer/operator using a remote control, the operator can also be the subject of the video. That is, the operator can be the moguling skier, the bungee jumper or the bride/groom at the wedding. This aspect of the invention is called the “third person camera perspective”: the shooting is from the perspective of the camera rather than from a “first person” photographer perspective.
Another aspect of the invention, compared to the prior art, concerns the means for calculating and setting elevation relative to the operator device, which contains a GPS, a barometer or other positioning systems. Rather than calculating elevation relative to the ground, the Lily UAV copter can calculate elevation in mid-air, which is what enables video shooting of a hang glider, for example. This is also what enables the UAV copter to adjust its position after it has been thrown up in the air.
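One way such a mid-air relative elevation can be computed is from two barometer readings via the standard hypsometric formula. The patent lists the sensors but not this particular computation, so the sketch below is illustrative.

```python
# Sketch: compute the copter's height above the operator device from
# two barometric pressure readings, with no reference to the ground.
# Uses the standard hypsometric formula for an isothermal layer.
import math

def relative_altitude_m(p_operator_pa: float, p_copter_pa: float,
                        temp_k: float = 288.15) -> float:
    """Height of the copter above the operator device, in meters."""
    R, g = 287.05, 9.80665   # dry-air gas constant (J/kg/K), gravity (m/s^2)
    return (R * temp_k / g) * math.log(p_operator_pa / p_copter_pa)

# The copter reads slightly lower pressure than the operator device:
print(round(relative_altitude_m(101325.0, 101265.0), 1))  # ~5.0 m
```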
Still another aspect of the Lily camera concerns the swappable battery pack. The UAV copter includes a backup battery within its battery compartment to prevent parameters from resetting during battery pack swaps.
The abstract of WO2015179797, titled Unmanned aerial copter for photography and/or videography and corresponding to the Lily UAV copter camera invention, is included below with a figure drawing of the UAV copter extracted from the patent. A short video is also included so you can see Lily in action.
Some embodiments include an unmanned aerial vehicle (UAV) copter for consumer photography or videography. The UAV copter can determine a first elevation of the UAV copter and a second elevation of an operator device. The UAV copter can adjust the first elevation by controlling thrust power of one or more propeller drivers to maintain a preset elevation difference between the first elevation and the second elevation. The UAV copter can locate a target subject relative to the UAV copter. The UAV copter can adjust at least one of the propeller drivers to point a first camera of the UAV copter at the operator device. In some embodiments, in response to detecting that the UAV copter has been thrown, the UAV copter can provide power adjustments for propeller drivers of the UAV copter to have the UAV copter reach a predetermined elevation above an operator device. [Abstract, WO2015179797]
Naturally, this invention won an SXSW™ 2016 Innovation award in the Sci-fi no longer category!
QUELL®, winner of an Innovation Award in the Wearable Tech category at SXSW® 2016, is wearable patented pain-relief technology that is 100% drug-free and non-addictive. [Quell® (1)]
Developed by NeuroMetrix, Inc., Quell® is neurotechnology worn as an upper-calf band, just below the knee. Quell® is Bluetooth™-enabled so that it can upload information and be controlled via a portable iOS or Android phone app.
The principle of the Quell® neurotechnology is to provide an electrode that stimulates sensory nerves according to a patented Optitherapy™ algorithm of pulse waveform (intensity, duration and shape) and pulse pattern specification (frequency and duration of session), taking user feedback into account and staying within clinically specified parameters.
Neurostimulation of sensory nerve fibers is thus delivered knowing that sensory nerves carry pulses to the brain, which in turn have the capacity to trigger the release of endogenous opioid-like peptides called enkephalins. The release of endogenous opioid-like enkephalins is a natural response with an analgesic effect that has the capacity to centrally block pain signals in the body. Additionally, the release of enkephalins targets δ-opioid receptors, which are different from the μ-opioid receptors targeted by opioid pain medication, thus creating a synergistic effect. [Quell® (2); Ghelardini et al., 2015]
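The two layers of the stimulation specification described above (a pulse waveform and a session-level pulse pattern) can be sketched as follows; the field names and example numbers are assumptions for illustration, not NeuroMetrix's actual Optitherapy™ parameters.

```python
# Illustrative sketch of a two-layer stimulation specification:
# a pulse waveform (intensity, duration, shape) and a pulse pattern
# (frequency, session duration). Names and values are assumed.
from dataclasses import dataclass

@dataclass
class PulseWaveform:
    intensity_ma: float   # pulse amplitude, milliamps
    duration_us: float    # pulse width, microseconds
    shape: str            # e.g. "biphasic"

@dataclass
class PulsePattern:
    frequency_hz: float   # pulses delivered per second
    session_min: float    # session length, minutes

    def pulses_per_session(self) -> int:
        return int(self.frequency_hz * self.session_min * 60)

waveform = PulseWaveform(intensity_ma=30.0, duration_us=200.0, shape="biphasic")
pattern = PulsePattern(frequency_hz=80.0, session_min=60.0)
print(pattern.pulses_per_session())  # -> 288000
```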
The Quell® technology, called WINS (Wearable Intensive Nerve Stimulation), is FDA-approved for day and night treatment of chronic pain. It is recommended as an adjunct to medication for all sorts of chronic pain (e.g., lower back and leg pain, diabetic pain and fibromyalgia). 67% of users reported a reduction in their use of pain medication, and 81% reported a reduction in chronic pain. [Quell® (3)]
As the device works during the night, tracking sleep patterns too, it is also designed to break the cycle of pain-disrupted sleep, which amplifies pain and reduces activity during waking hours.
The various functions and aspects of the NeuroMetrix, Inc. neurotechnology inventions are disclosed in at least 70 patents, embodied in three additional devices:
1. The wearable SENSUS® pain management system, developed more specifically for neuropathic pain resulting, for example, from diabetes;
2. The hand-held NC-Stat® DPN-Check® device, for measuring and quantifying peripheral neuropathy; and
3. The Advance® NCS (Nerve Conduction Study) System, designed for studying and measuring nerve conduction in any setting.
US2015148865, titled Apparatus and method for relieving pain using transcutaneous electrical nerve stimulation, recites those aspects of the Quell® invention pertaining to the automated adjustment of the intensity of the neurostimulation and its intensification during a therapy session to avoid habituation.
The Quell® invention thus resolves many of the prior art problems in the field of Transcutaneous Electrical Nerve Stimulation (TENS), in particular those related to device portability, wiring, user training and user support in regulating the right amount of stimulation, the duration of sessions and the issue of habituation.
The abstract of the invention is included below, as well as a patent drawing of the electrode band. An image of the marketed Quell® device is also included above.
Apparatus for transcutaneous electrical nerve stimulation in humans, the apparatus comprising: a housing; stimulation means mounted within the housing for electrically stimulating nerves; an electrode array releasably mounted to the housing and connectable to the stimulation means, the electrode array comprising a plurality of electrodes for electrical stimulation of nerves; control means mounted to the housing and electrically connected to the stimulation means for controlling at least one characteristic of the stimulation means; monitoring means mounted to the housing and electrically connected to the stimulation means for monitoring at least one characteristic of the stimulation means; user interface means mounted to the housing and electrically connected to the control means for controlling the stimulation means; display means mounted to the housing and electrically connected to the control means and the monitoring means for displaying the status of the stimulation means; and a strap attached to the housing; wherein the strap is configured to hold the housing, stimulation means and electrode array at a specific anatomical location to treat pain. [US2015148865]
The University of Texas Cockrell School of Engineering (Sharma, 2014) won the 2015 SXSW™ Interactive Innovation Award, in the “Sci-fi No Longer” category, with a non-invasive skin cancer detection device. This is a pen-like optical probe that uses light spectroscopy in three different modes to interrogate skin tissue. It is non-invasive in that it requires no biopsy for the testing and diagnosis of skin lesions.
The three spectroscopic technologies of this device, termed MMS (multimodal spectroscopy), are Raman spectroscopy (RS), diffuse reflectance spectroscopy (DRS) and laser-induced fluorescence spectroscopy (LIFS). Together, these technologies and their different modes of emitting light are designed to provide complementary sorts of micro-environmental and biochemical information about skin tissue, for a far improved and faster diagnosis compared to traditional macro-visual, biopsy-based detection. For example, an interrogation of skin tissue using the multimodal spectroscopy (MMS) probe takes about 4.5 seconds (compared to several days for biopsy results).

US2012057145 (A1), titled Systems and methods for diagnosis of epithelial lesions, is the patent application corresponding to this device and its algorithms. In general, this invention addresses problems of skin cancer diagnosis and the current and most common methods of diagnosis involving tissue biopsy. Beyond the discomfort, cosmetic concerns, expense and turnaround time for biopsy results, this most common method of diagnosis relies on inherently qualitative methods of macro-visual clinical examination, that is, on physician experience in visually identifying which lesions should be biopsied. Critical reliance on physician experience thus enters the equation, because there are documented differences in the accuracy with which lesions are detected by general practitioners and dermatologists [US2012057145], compounded by issues of access to specialized dermatology care, whether due to cost, geographic location or scarcity, and by the burden of unnecessary biopsies. Finally, the accuracy of this inventive diagnostic method is also intended to resolve issues of safety margins for the perimeter of excisions when surgery is required.
Skin lesion micro-environments and bio-chemical properties are interrogated via spectroscopy, that is: 1. emitting a light source into skin tissue using an optic fiber; 2. collecting the light re-emitted from the skin tissue with a second optic fiber; and 3. generating spectra for the re-emitted light, using a spectrophotometer, in terms of diagnostically relevant parameters such as intrinsic fluorescence and absorption, reduced scattering coefficients and Raman scattering. The spectral information is then matched with known properties using a specifically generated look-up table algorithm. It is known, for example, that the fluorescence of certain endogenous fluorophores such as collagen changes in the presence of disease, and that the scattering and absorption properties of light are affected by morphological changes in the tissue. The analysis of the light emitted back from the skin tissue thus yields micro-information about the properties of the tissue, such as collagen structure, nuclear morphology, blood fraction, oxygen saturation and a tissue scattering coefficient, all of which swiftly informs diagnosis.

The development of this invention (that is, of the hardware and software) and the associated research were funded by the Centers for Disease Control. The abstract of this multimodal spectroscopy invention, recited in US2012057145 (A1), titled Systems and methods for diagnosis of epithelial lesions, is included below, as well as an image of the front view of the probe surface in contact with skin, showing the Raman spectroscopy delivery (red) and collection (blue) fibers, and the diffuse reflectance spectroscopy delivery (yellow) and collection (green) fibers (Sharma, 2014).
"Systems comprising an optical fiber switch connected to a light source and an optical fiber probe, the optical fiber probe comprising a first optical fiber connected to the optical fiber switch and a second optical fiber connected to a spectrophotometer. Methods for determining one or more tissue parameters comprising: emitting light from a first optical fiber into a tissue; collecting the light reemitted from the tissue with a second optical fiber; generating a spectra of the light reemitted from the tissue with a spectrophotometer; and utilizing a look-up table based algorithm to determine one or more tissue parameters, wherein the lookup-table based algorithm comprises the steps of: generating a look-up table by measuring the functional form of a reflectance measured by the spectrophotometer using one or more calibration standards with known optical properties; and implementing an iterative fitting routine based on the lookup-table." Abstract US2012057145 (A1)
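The look-up-table matching step recited in the abstract can be sketched in miniature as follows; the toy forward model and parameter grids below are assumptions for illustration, standing in for the patent's calibration-standard measurements and iterative fitting routine.

```python
# Minimal sketch of look-up-table (LUT) inversion: reflectance values
# are tabulated over a grid of candidate optical properties, then a
# measured reflectance is inverted by finding the best-matching entry.
# The toy forward model is assumed for illustration; the patent builds
# its LUT from calibration standards with known optical properties.
import itertools

def forward_reflectance(mu_s_prime: float, mu_a: float) -> float:
    # Stand-in for the empirically calibrated function: reflectance
    # rises with scattering and falls with absorption.
    return mu_s_prime / (mu_s_prime + 10.0 * mu_a)

# 1. Build the LUT over a grid of candidate tissue parameters.
scatters = [0.5 + 0.1 * i for i in range(30)]    # reduced scattering, 1/mm
absorbs = [0.01 + 0.01 * i for i in range(30)]   # absorption, 1/mm
lut = {(s, a): forward_reflectance(s, a)
       for s, a in itertools.product(scatters, absorbs)}

# 2. Invert a measured reflectance by nearest match in the table
#    (a stand-in for the patent's iterative fitting routine).
def fit(measured: float) -> tuple[float, float]:
    return min(lut, key=lambda k: abs(lut[k] - measured))

measured = forward_reflectance(1.5, 0.05)   # simulated measurement
mu_s, mu_a = fit(measured)
print(abs(forward_reflectance(mu_s, mu_a) - measured) < 1e-9)  # -> True
```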
Work appears to be underway to correct for the effects of skin pigmentation on spectroscopic methods of skin cancer detection (e.g., Soyemi et al., 2005; Bersha, 2010), including the issue of the depth of skin tissue interrogation (Tseng et al., 2008). While skin cancer is more prevalent in fair-skinned individuals, it is nonetheless more fatal in darker-skinned individuals, due to late diagnosis and misinformation about epidemiological data (The Skin Cancer Foundation, 2009).
In any event, this award-winning, non-invasive diagnostic technology is indeed... no longer sci-fi...
SXSW™ (a film, music and tech event) is now underway in Austin, TX.
In 2015, there are 145 films scheduled for screening at SXSW™ film, March 13 to 21!
And if SXSW™ Music 2015 tops SXSW™ Music 2014, there will be upwards of 28,000 music professionals attending the event, with more than 2,300 performances in more than 100 venues scattered around Austin's bars, clubs, parks, churches and elsewhere, and performers hailing from more than 57 countries, for 6 days of festival fun, March 17 to 22!
Add to SXSW™ Film and SXSW™ Music the SXSW™ Interactive event, March 13-17, and you also have 5 days of cutting-edge creativity, innovation and inspiration across new interactive media technologies.
The SXSW™ Interactive Conference includes a five-day program of keynotes, panels and presentations, awards, and the showcasing of some of the most far-out talent and inventions. Among the awards, modeled Wimbledon-style with rounds, finalists and alternates: the SXSW™ Interactive Awards, the SXSW™ Dewey Winburne Community Service Awards (for do-gooders with tech), the SXSW™ Venture Accelerator Awards (to help uncover amazing new start-ups), the SXSW™ Gaming Awards and the Released at SXSW™ Awards.
Twenty-eight years after its inception in 1987, SXSW™ has become the single most economically profitable event in Austin, fetching 218 million USD in 2013. And SXSW™ has also spread to other cities, inspiring other “four-letter” venues at the intersection of technology and art (e.g., XOXO in Portland, OR; South by Due East in Houston, TX; YXYY (Yes and Yes Yes) in Palm Springs, CA; plus others). [Wikipedia]
-----
This year, on March 13, among an audience of innovators at SXSW™, a special swearing-in ceremony was even held for Michelle K. Lee, the new Under Secretary of Commerce for Intellectual Property and Director of the USPTO!
----
More about some of the 2015 line-up at SXSW™ coming soon!