Cruisers Forum
 


Old 27-07-2018, 21:24   #1
Registered User
 
CareKnot's Avatar

Join Date: Sep 2016
Location: Greater Houston Galveston Metroplex
Boat: 1979 Endeavor 32
Posts: 337
OCPN Poor man's radar and more...

The following is a rough idea to enhance the capabilities of OCPN. Criticism and contributions are welcome. After I lay out the basic idea, I will follow up with some applications that I believe might benefit the mariner and integrate with OCPN.

This idea attempts to use the properties of light and common off-the-shelf components to produce a data product that can be consumed by OCPN and also by an independent viewer that can 'see' through fog and smoke and in low-light situations. The intent is to use an Android smartphone (its camera, gyroscopes, accelerometers, magnetometers and GPS [and an array of radio sensors to enhance accuracy and resolution]) to produce images and line-of-sight target vector data. The output represents both dynamic and static objects as a series of lat/lon pairs over time, suitable for chart and live-stream video overlays.
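
To make that concrete, here is a hypothetical sketch of the kind of record I mean (the format and the values are made up purely for illustration, not any existing OCPN or NMEA format):

[CODE]
# A hypothetical illustration of the kind of output I have in mind for one
# tracked object: its position sampled over time.
target_track = [
    # (UTC timestamp,         latitude,  longitude)
    ("2018-07-27T21:24:00Z",  29.3013,  -94.7977),
    ("2018-07-27T21:24:10Z",  29.3010,  -94.7969),
    ("2018-07-27T21:24:20Z",  29.3007,  -94.7961),
]
# Successive fixes like these are enough to derive a course and speed for an overlay.
[/CODE]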

Not all recreational mariners have AIS, Radar or other instruments that would allow them to quickly determine the location of fixed objects or to track moving objects in poor visibility. But most do have smartphones and most smartphones have numerous means to generate and export data.

Laser rangefinders are an established product, but I believe it may be possible to achieve the same ends without a coherent light source, at least for this project. Shifting the frequency of the light may be adequate to identify and measure reflections, at least within a range suitable for this purpose.

Standing at my helm, my eye level is about 8' above the waterline. That puts my visible horizon about 3.5 statute miles away. But with the proper optics, I can see an object that is 50' high from about 8.7 miles, or 45,936 feet. At the speed of light (not adjusted for atmosphere or other media), the round trip from a light source to a target 8.7 miles away and back takes approximately 0.000092 seconds, i.e. about 0.092 milliseconds (92 microseconds).
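
Here is the arithmetic as a quick sketch, in case anyone wants to check my figures (using the rule of thumb that geographic range in statute miles is roughly the square root of 1.5 times the height in feet):

[CODE]
import math

# Back-of-the-envelope check of the figures above (a sketch, not a precise
# atmospheric model).
C_FT_PER_S = 983_571_056.0      # speed of light in feet per second (vacuum)

eye_height_ft = 8.0
target_height_ft = 50.0

horizon_eye = math.sqrt(1.5 * eye_height_ft)        # ~3.5 statute miles
horizon_target = math.sqrt(1.5 * target_height_ft)  # ~8.7 statute miles

one_way_ft = horizon_target * 5280.0                # ~45,700 ft
round_trip_s = 2.0 * one_way_ft / C_FT_PER_S        # ~9.3e-5 s

print(f"horizon (eye): {horizon_eye:.1f} sm, horizon (target): {horizon_target:.1f} sm")
print(f"round trip: {round_trip_s * 1e6:.0f} microseconds")   # ~93 microseconds
[/CODE]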

I do not know the resolution of a current smartphone compass when corrected by input from the other sensors. In fact, I am just spitballing here. But I do know that as distance is reduced, accuracy increases. So it seems conceivable that target range data can be generated with enough accuracy, and within an acceptable time frame, to produce position, course and speed data to: 1) overlay on a chartplotter; 2) identify objects to avoid (for an autohelm plugin, perhaps); 3) overlay on the smartphone's camera display; and 4) transmit object vector data to a central server.
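
For what it is worth, the core extrapolation step is simple once you have a range and a bearing. A rough sketch (flat-earth approximation, which is fine over a few miles; the names and the example position are mine, nothing OCPN-specific):

[CODE]
import math

# Own position (phone GPS) plus a bearing (compass / sensor fusion) plus a
# measured range gives the target's position.
EARTH_RADIUS_M = 6_371_000.0

def project_target(own_lat_deg, own_lon_deg, bearing_true_deg, range_m):
    lat_rad = math.radians(own_lat_deg)
    brg_rad = math.radians(bearing_true_deg)
    d_lat_deg = math.degrees(range_m * math.cos(brg_rad) / EARTH_RADIUS_M)
    d_lon_deg = math.degrees(range_m * math.sin(brg_rad) / (EARTH_RADIUS_M * math.cos(lat_rad)))
    return own_lat_deg + d_lat_deg, own_lon_deg + d_lon_deg

# Example: a target 2,000 m away bearing 045 degrees true from 29.30 N, 94.80 W
print(project_target(29.30, -94.80, 45.0, 2000.0))
[/CODE]

Two such fixes taken a known time apart are enough to derive the target's course and speed.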

In this manner, not only could mariners get real-time Notice to Mariners updates marked on their charts and track non-AIS-equipped vessels, but the collected data could contribute to navigable cartography too.

Seeing through fog and some other suggested applications will be covered in following posts.
__________________
Kindest Regards,
Phillip
CareKnot is offline   Reply With Quote
Old 28-07-2018, 10:03   #2
Registered User
 
CareKnot's Avatar

Join Date: Sep 2016
Location: Greater Houston Galveston Metroplex
Boat: 1979 Endeavor 32
Posts: 337
Re: OCPN Poor man's radar and more...

Forgive me, but I have changed my mind. I want to dig into the basics before I move on.

The basic idea (very simplified) of a laser rangefinder is to bounce coherent light off an object and measure how long the light takes to travel from the emitter to the object and back; or, alternatively, to offset the emitter and detector by a known baseline and measure the angle at which the reflection returns (triangulation). There are numerous science sites that cover this topic in detail and even offer DIY instructions for basic laser rangefinder projects. For the sake of clarity, a basic diagram looks like this:

[rangefinder diagram attachment not reproduced]
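
In numbers, the two principles reduce to very simple formulas. A sketch (the constants assume vacuum, and the baseline and angle figures are just examples):

[CODE]
import math

C = 299_792_458.0   # speed of light, m/s (vacuum)

# Time of flight: range from the measured round-trip delay.
def tof_range_m(round_trip_s):
    return C * round_trip_s / 2.0

# Triangulation: emitter and detector separated by a known baseline; the detector
# measures the angle between the baseline and the returning reflection.
def triangulation_range_m(baseline_m, angle_deg):
    return baseline_m * math.tan(math.radians(angle_deg))

print(tof_range_m(92e-6))                 # ~13,790 m, about 8.6 statute miles
print(triangulation_range_m(0.5, 89.0))   # ~28.6 m with a 0.5 m baseline
[/CODE]
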
The easiest way to reach the goal of putting range data on screen would be to use a riflescope with a built-in laser rangefinder. However, extracting the range data for further refinement and application would require an extensive hardware hack, unless I am missing something. So it seems desirable to take the DIY approach for the rangefinder and develop an Android app to control the device and harvest the data. That is the current plan.

There are other methods besides laser light for collecting range data. In the 1940s a radio-frequency altimeter was developed that bounced radio waves off the earth thousands of feet below. There are also sonic rangefinders, whose range is much shorter; camera autofocus mechanisms and automotive parking sensors are good examples.

I found some interesting material at Stack Exchange that may prove useful. From what I have gathered, in every case the inability to focus radio and sonic emissions into a narrow beam limits their useful range in environments with multiple targets or interference. However, they can tell you the range to the closest object. So each option has its application, and modern integrated circuits have made investigating these options seem worthwhile to me.
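
For comparison, the sonic version is the same arithmetic with the speed of sound swapped in (a sketch; 343 m/s is an assumed value for air at around 20 C, and it varies with temperature and humidity):

[CODE]
SPEED_OF_SOUND_M_S = 343.0

def sonic_range_m(echo_delay_s):
    # Range from the round-trip echo delay, same form as the light version.
    return SPEED_OF_SOUND_M_S * echo_delay_s / 2.0

print(sonic_range_m(0.058))   # ~10 m; the delays are milliseconds rather than
                              # nanoseconds, which is why cheap chips can do it
[/CODE]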

Now let's move on to imaging and consider how to see through fog.

If we project a light at the fog (coherent light or otherwise), it will illuminate the water droplets suspended in the air. The reflection manifests as 'glare' that obscures downrange imagery.

Everyone is familiar with the glare of headlights in fog. But if you use low beams, the glare is reduced and visibility improves. Switch to fog lamps instead of headlamps and visibility improves even more. The lamps sit farther from the direct line of observation and are aimed along a nearly parallel path. The effect makes a profound difference in image visibility.



Now let's imagine we can eliminate glare almost completely. Would objects become noticeably clearer? Absolutely. Would objects at long distance be discernible? Yes. Given the proper optics, even identifiable. But let's assume for a moment that they aren't identifiable. How important is image resolution of distant objects?

All a mariner really needs to know about a distant object is 1) that it exists, 2) where it is, and 3) what direction and how fast it is moving. Information like this is invaluable to a helmsman who is socked in and 'flying blind', right? But at distance, all that is needed on the display is a visual marker and notifications about possible lines of convergence.

So, how do we achieve seeing through fog? We turn off the light source and wait for the reflections that come from farther away, after the nearby glare has died out. It's really that simple. We catch the reflection off the objects downrange but exclude the nearby reflection off the fog. This might be hard to visualize because, in our frame of reference, light and its reflection seem instantaneous. However, sound travels much more slowly than light, so let's use sound to help visualize the effect.

Imagine you are in a large canyon. You fire a pistol and wait to hear the echo. The time it takes to hear the echo determines the distance to the reflective surface; the canyon wall, right?

OK, same scenario, but this time you fire up a loud engine. It continues to run, kind of like a spotlight that continues to shine. As long as that engine runs, you will not be able to hear the echo; it will be drowned out by the engine noise, much as distant reflections are drowned out by the glare from nearby ones. However, if the engine stops abruptly, you can measure how long it takes from the moment the engine stops to the moment the echo stops.

What we have done is move from measuring the leading edge of a sound and the leading edge of its echo to measuring the trailing edge of the sound and of its echo. And we can do the same thing with light: turn it off and wait for its reflection to end. One benefit of using this technique for imaging is that the backlight effect also helps define the edges of objects in the foreground. It's all a matter of timing.

You pulse the light source. After the pulse ends, plus a predetermined delay, the device starts recording image data. Each pixel altered past that point represents image data, and the elapsed time since the pulse (microseconds or less at these distances) represents range.
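
As far as I can tell, what I am describing is essentially range-gated (time-gated) imaging, and the timing works out like this (a sketch; the speed of light is taken in vacuum and the 200 m minimum range is just an example):

[CODE]
C = 299_792_458.0   # speed of light, m/s (vacuum)

def gate_delay_s(min_range_m):
    # Keep the detector off until light could have gone out to min_range and back,
    # so anything nearer (the fog right in front of the lens) is never recorded.
    return 2.0 * min_range_m / C

def range_from_arrival_m(arrival_after_pulse_s):
    # A return recorded this long after the pulse corresponds to this range.
    return C * arrival_after_pulse_s / 2.0

print(f"gate delay to skip the first 200 m: {gate_delay_s(200.0) * 1e9:.0f} ns")  # ~1334 ns
print(f"return at 10 microseconds -> {range_from_arrival_m(10e-6):.0f} m")        # ~1499 m
[/CODE]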

What I do not know at this time are the timing characteristics of a typical smartphone camera's CCD array, or those of a laser that might prove suitable for this task. Perhaps some of the more technically gifted can offer some guidance.
__________________
Kindest Regards,
Phillip
CareKnot is offline   Reply With Quote
Old 28-07-2018, 10:52   #3
Registered User
 
transmitterdan's Avatar

Join Date: Oct 2011
Boat: Valiant 42
Posts: 6,008
Re: OCPN Poor man's radar and more...

I don’t understand what problem you are trying to solve.

Many rangefinders don't measure the time delay directly. They measure the phase shift of a modulated light signal between source and reflection, which is linear with time delay. A camera CCD is much too slow to measure speed-of-light events; typical frame rates put tens of milliseconds between frames.
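
For illustration, the phase method works out roughly like this (the 10 MHz modulation frequency is an assumption for the example, not the spec of any particular device):

[CODE]
import math

C = 299_792_458.0   # speed of light, m/s
F_MOD = 10e6        # modulation frequency in Hz; 10 MHz is only an example

def range_from_phase_m(phase_shift_rad):
    # One full cycle of phase corresponds to one modulation wavelength of
    # round-trip distance, so range = c * phase / (4 * pi * f_mod).
    return C * phase_shift_rad / (4.0 * math.pi * F_MOD)

print(range_from_phase_m(math.pi / 2.0))   # ~3.75 m for a 90 degree phase shift
print(C / (2.0 * F_MOD))                   # ~15 m unambiguous range per cycle
[/CODE]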

Maybe instead of writing up the solution you could clearly state the problem first. Are you trying to see things relative to the boat (like radar) or trying to identify the position of things relative to a fixed coordinate system?
transmitterdan is offline   Reply With Quote
Old 29-07-2018, 09:21   #4
Registered User
 
CareKnot's Avatar

Join Date: Sep 2016
Location: Greater Houston Galveston Metroplex
Boat: 1979 Endeavor 32
Posts: 337
Re: OCPN Poor man's radar and more...

Hi Dan,
Thanks for responding. I'll answer you as best I can. You said:
Quote:
Originally Posted by transmitterdan View Post
I don’t understand what problem you are trying to solve.
Forgive me if I wasn't clear, Dan. I tend to ramble when I get inspired. If I could, I would heavily edit the preceding posts.

This isn't so much about finding a solution to a specific problem. It is about more fully realizing opportunities. My clumsy efforts to explain my intentions have obviously caused some confusion. Mea culpa.

Cost is an overarching concern for many recreational and cruising mariners. Navigation instruments are expensive. I am attempting to mitigate those costs by repurposing existing and widely available hardware such as a smartphone. The goal is to produce useful location data and interface that data with the OCPN platform 'on the cheap'.

OCPN is an extraordinary navigational tool. It is a feature-rich package and offers a flexible platform for developing an astonishing array of plugins. I simply want to contribute to this effort, not by writing code, but by thinking outside the box.

It seems to me that the sensing, measuring and processing capabilities of current smartphones should be exploited for the recreational and cruising mariner and the OCPN platform. So, the purpose of this exercise is fourfold:
  1. Uncover the most efficient means of utilizing readily available hardware,
  2. Expand on available capabilities with external devices where needed,
  3. Generate useful data streams for the OCPN platform and,
  4. Campaign to expand the platform itself to include multiple data contributors, a data storage platform, data processing to discrete products, and finally, automatic download of those products and some form of automatic updates.

The first scenario I am exploring is extrapolating a smartphone's data (time, position, orientation, direction of motion, speed, etc.) to other objects and vessels. The applications of that data collection effort might include collision avoidance, Notice to Mariners-type information and navigable cartographic updates, to name but a few, so...

On the subject of rangefinding and how it is used to extrapolate the device's location-related data to line-of-sight objects and vessels:
There are a number of DIY projects that I am researching. Some may be worth pursuing. On the other hand, there are some laser rangefinders that boast a computer interface, but they seem cost-prohibitive. So I am soliciting thoughtful contributions.

I hope this gives you a better idea of my intentions. This being a case of eating the proverbial elephant, it will have to be taken one bite at a time. Ask anything you like.
__________________
Kindest Regards,
Phillip
CareKnot is offline   Reply With Quote
Old 31-07-2018, 10:30   #5
Registered User
 
CareKnot's Avatar

Join Date: Sep 2016
Location: Greater Houston Galveston Metroplex
Boat: 1979 Endeavor 32
Posts: 337
Re: OCPN Poor man's radar and more...

I've been spending a bit of time here: https://forums.parallax.com/discussi...r-big-distance

There are some pretty good discussions on DIY stuff, some of which they have turned into commercial products.
__________________
Kindest Regards,
Phillip
CareKnot is offline   Reply With Quote