A compact camera built using an optical mouse

(petapixel.com)

217 points | by PaulHoule 3 days ago

41 comments

  • MarkusWandel 3 days ago

    I always say "on a scale from no canoe to a $5K canoe, even the crappiest canoe is 80% of the way there". This camera illustrates that for vision. When you hear about those visual implants that give you, say, 16x16 grayscale, you think that's nothing. Yet 30x30 grayscale as seen in this video, especially with live updates rather than just a still frame, is... vision. Not 80% of the way there, but it punches way above its weight class in terms of usefulness.

    • SwtCyber 6 hours ago

      The moment you add motion and temporal continuity, even a postage-stamp image turns into something your brain can work with

      • MarkusWandel 4 hours ago

        The brain really is quite a machine. I've personally had a retinal tear lasered. It's well within my peripheral vision, and the lasering of course did more damage (but prevents the tear from spreading). How much of this can I see? Nothing! My peripheral vision appears continuous. Probably I'd miss a motion event visible only to that eye in that particular zone. Not to mention the enormous number of "floaters" one gets, especially by my age (58). Sometimes you see them, but for the most part the brain just filters them out.

        Where this becomes relevant is when you consider depixellation. True blur can't be undone, but pixellation without appropriate antialiasing filtering...

        https://www.youtube.com/watch?v=acKYYwcxpGk

        So if your 30x30 camera has sharp square pixels with no antialiasing filter in front of the sensor, I'll bet the brain would soon learn to "run that depixellation algorithm" and, just by natural motion of the camera, learn to recognize finer detail. Of course that still means training the brain to recognize 900 electrodes, which is beyond the current state of the art (but 16x16 pixels aren't, and the same principle can apply there).
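
        For the curious, that "depixellation by motion" trick is essentially multi-frame shift-and-add super-resolution. Here is a rough numpy sketch of the idea, with a synthetic scene and known sub-pixel shifts standing in for the brain's job of estimating them:

          import numpy as np

          rng = np.random.default_rng(0)
          S = 4                                       # sub-pixel positions per sensor pixel
          hi = rng.random((30 * S + S, 30 * S + S))   # "true" fine-grained scene

          def capture(dx, dy):
              # One 30x30 frame: crop at a sub-pixel offset, then box-average
              # into sharp square sensor pixels (no antialiasing filter).
              crop = hi[dy:dy + 30 * S, dx:dx + 30 * S]
              return crop.reshape(30, S, 30, S).mean(axis=(1, 3))

          # Shift-and-add: paste each upsampled frame back at its known offset.
          acc = np.zeros_like(hi)
          cnt = np.zeros_like(hi)
          for _ in range(200):
              dx, dy = rng.integers(0, S, size=2)
              up = np.kron(capture(dx, dy), np.ones((S, S)))
              acc[dy:dy + 30 * S, dx:dx + 30 * S] += up
              cnt[dy:dy + 30 * S, dx:dx + 30 * S] += 1
          recon = acc / np.maximum(cnt, 1)            # finer detail than any single 30x30 frame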

        • jacquesm 2 hours ago

          It would be interesting to see how far you could push that. I bet just two scanlines side by side would be enough for complete imaging. Maybe even just one, but that would require a lot more pre-processing and much finer control over the angle of movement. Congrats on the positive outcome of that surgery; that must have been pretty scary.

    • lillecarl 13 hours ago

      Diminishing returns explained through canoes :)

      16x16 sounds really shit to me, who still has perfect vision, but I bet it's life-changing to be able to identify the presence or absence of stuff around you and such! Yay for technology!

      • ACCount37 11 hours ago

        This kind of thing is really held back by BCI tech.

        By now, we have smartphones with camera systems that beat human eyes, and SoCs powerful enough to perform whatever image processing you want them to, in real time.

        But our best neural interfaces have throughput close to that of a dial-up modem, and questionable longevity. Other technological blockers have advanced in leaps and bounds, but the SOTA in BCIs today is not that far from where it was 20 years ago. Because medicine is where innovation goes to die.

        It's why I'm excited for the new generation of BCIs like Neuralink. For now, they're mostly replicating the old capabilities, but with better fundamentals. But once the fundamentals - interface longevity, ease of installation, ease of adaptation - are there? We might actually get more capable, more scalable BCIs.

        • SiempreViernes 10 hours ago

          > Because medicine is where "move fast and break things" means people immediately die.

          Fixed the typo for you.

          • ACCount37 9 hours ago

            Not moving fast and not breaking things means people die slowly and excruciatingly. Because the solutions for their medical issues were not developed in time.

            Inaction has a price, you know.

            • omnicognate 8 hours ago

              It has a price for the person with the condition. For the person developing the cure it does not (except perhaps the opportunity cost of money that could have been made), whereas killing their patients can have an extremely high one.

            • jama211 3 hours ago

              You’re starting to sound terrifyingly like an unethical scientist. We know how that ends, we’ve been down that road before, and we know why it is a terrible idea.

        • arcanemachiner 11 hours ago

          To anyone wondering:

          BCI == Brain-computer interface

          https://en.wikipedia.org/wiki/Brain–computer_interface

          • Lapsa 11 hours ago

            Mind-reading technology has already arrived: radiomyography and neural networks deciphering EEGs.

            • ACCount37 10 hours ago

              Not really. Non-invasive interfaces don't have the resolution. Can't make an omelet without cracking open a few skulls.

              • Lapsa 7 hours ago

                They do read my mind, at least to some extent -> "The paper concludes that it is possible to detect changes in the thickness and the properties of the muscle solely by evaluating the reflection coefficient of an antenna structure." https://ieeexplore.ieee.org/document/6711930

      • metalman 11 hours ago

        It is a good illustration of something like Moore's law, for a coming end point where a handheld device will have more than enough capability and capacity to do ANYTHING a mere mortal will EVER require, and the death of options and features, and a return to personal autonomy and responsibility.

        AI is the final failure of "intermittent" wipers, which, like in my latest car, are irrevocably enabled to smear road grime and imperceptible "rain" into a goo, blocking my ability to see.

        • makeitdouble 9 hours ago

          True. Then we cross a threshold where things that weren't even thought of as possible become reachable, and we're back on the treadmill.

          That's what's happening with VR: we reached a point where increasing DPI for laptops or phones seemed to make no sense; but that was also the point where VR started to become reachable, and there a 300 DPI screen is crude and we'd really want 3x that pixel density.

        • immibis 10 hours ago

          use the washer button to spray the windshield with water and help the goo wipe off

          • metalman 5 hours ago

            Yes, obviously, but my point is that I am now tasked with helping the "feature" limp along whenever it lurches, unexpectedly, into action, thereby ADDING distraction, which, if you read the ancient myths and legends, is one of the main methods by which evil spirits and demons undermine and defeat the unwary... and lull them into becoming possessed hosts for said entities.

            Who's working for whom here, anyway?

            Already?

  • mdtrooper 10 hours ago

    This kind of news is, for me, the real news for this website, rather than a new fancy tech product from Apple or a similar corporation.

    Sincerely, many thanks.

    • bbeonx 14 minutes ago

      Agreed... I think it's fine to keep up with what the corporate world is doing, but these projects bring me real joy.

    • SwtCyber 6 hours ago

      Corporate launches are predictable and polished; projects like this are the opposite

  • anotherpaul 13 hours ago
    • zamadatix 8 hours ago

      One of the comments from the creator answered one of my questions https://www.reddit.com/r/3Dprinting/comments/1olyzn6/comment...:

      > Do you mean the refresh rate should be higher? There's two things limiting that:

      > - The sensor isn't optimized for actually reading out images, normally it just does internal processing and spits out motion data (which is at high speed). You can only read images at about 90Hz

      > - Writing to the screen is slow because it doesn't support super high clock speeds. Drawing a 3x scale image (90x90 pixels) plus reading from the sensor, I can get about 20Hz, and a 1x scale image (30x30 pixels) I can get 50Hz.

      I figured there would be limitations around the second, but I was hoping the first wasn't such a big limit.
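
      For a sense of what that first limit looks like in practice, here's a hypothetical MicroPython sketch of the frame-grab path on an ADNS-family sensor. The register addresses, magic values and delays below are placeholders (check the ADNS-3090 datasheet); only the overall flow is the point: trigger a frame capture, then burst-read 30x30 = 900 six-bit pixels over SPI.

        from machine import Pin, SPI
        import time

        NCS = Pin(5, Pin.OUT, value=1)            # chip select (pin choice is arbitrary)
        spi = SPI(1, baudrate=2_000_000, polarity=1, phase=1)

        REG_FRAME_CAPTURE = 0x13                  # placeholder register address
        REG_PIXEL_BURST   = 0x40                  # placeholder register address
        FRAME_PIXELS      = 30 * 30

        def write_reg(addr, value):
            NCS(0)
            spi.write(bytes([addr | 0x80, value]))    # MSB set marks a register write
            NCS(1)
            time.sleep_us(50)                         # inter-transaction delay

        def grab_frame():
            write_reg(REG_FRAME_CAPTURE, 0x83)        # ask the sensor to latch one frame
            time.sleep_ms(2)                          # wait for the frame to be ready
            NCS(0)
            spi.write(bytes([REG_PIXEL_BURST]))
            raw = spi.read(FRAME_PIXELS)              # one pixel per byte, burst mode
            NCS(1)
            return [b & 0x3F for b in raw]            # keep the 6-bit value, 0..63

        while True:
            frame = grab_frame()                      # ~90 Hz is the ceiling quoted above
            # pushing `frame` to the display is what drops the loop to ~20-50 Hz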

  • consumer451 8 hours ago
  • kachapopopow an hour ago

    Waiting for someone to build a high-speed camera using mouse sensors.

  • gsf_emergency_6 12 hours ago

    Compressed sensing! What Terence Tao uses to sell math funding!!

    https://www.youtube.com/watch?v=EE9AETSoPHw&t=44

    https://www.instructables.com/Single-Pixel-Camera-Using-an-L...

    (Okay not the same guy, but I wanted to share this somewhat related "extreme" camera project)

    • fph 10 hours ago

      Is this compressed sensing though? The description says "Sensor 30x30 pixels, 64 colors (ADNS-3090 if you wanna look it up)", so definitely not a single-pixel camera.

      • gsf_emergency_6 8 hours ago

        Sorry for the confusion. This is a different setup: TFA uses the 30x30 sensor and no compressed sensing. The link above uses a single photodetector. They also use an LED matrix, but that's to form the _image_ (I think).
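
        For anyone curious what that single-photodetector scheme looks like in its simplest form, here's a toy numpy simulation (not the Instructable's actual code): light the scene with known patterns, record one brightness value per pattern, then solve for the image. With as many patterns as pixels this is plain linear algebra; compressed sensing proper gets away with fewer patterns by adding a sparsity prior.

          import numpy as np

          rng = np.random.default_rng(1)
          N = 8 * 8                                   # toy 8x8 "image"
          scene = rng.random(N)                       # unknown pixel values

          patterns = rng.integers(0, 2, size=(N, N)).astype(float)   # LED on/off masks
          readings = patterns @ scene                 # one photodetector value per mask

          recovered, *_ = np.linalg.lstsq(patterns, readings, rcond=None)
          print(np.allclose(recovered, scene))        # True: the image is reconstructed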

    • HPsquared 9 hours ago

      I wonder how much quality you could get out of that sensor.

  • shit_game 8 hours ago

    Very cool project. I love the detail the poster went into in their linked video post about working with the sensor and their implementation.

    > Optical computer mice work by detecting movement with a photoelectric cell (or sensor) and a light. The light is emitted downward, striking a desk or mousepad, and then reflecting to the sensor. The sensor has a lens to help direct the reflected light, enabling the mouse to convert precise physical movement into an input for the computer’s on-screen cursor. The way the reflected changes in response to movement is translated into cursor movement values.

    I can't tell if this grammatical error is the result of nonchalant editing and a lack of proofreading, or of a person touching up LLM content.

    > It’s a clever solution for a fundamental computer problem: how to control the cursor. For most computer users, that’s fine, and they can happily use their mouse and go about their day. But when Dycus came across a PCB from an old optical mouse, which they had saved because they knew it was possible to read images from an optical mouse sensor, the itch to build a mouse-based camera was too much to ignore.

    Ah, it's an LLM. Dogshit grifter article. Honestly, the HN link should be changed to the reddit post.

    • foxglacier 32 minutes ago

      LLM or not doesn't matter as much as the fact that it's just bad, reader-hostile writing: a dump of trivial details that also glosses over the relevant part (how a mouse actually detects movement).

  • jacquesm 2 hours ago

    These are 'optical flow' chips. They are quite interesting for many reasons.

  • supportengineer 4 hours ago

    This is fantastic. What an amazing project! There is a certain segment of photography enthusiasts who love the aesthetic.

  • JKCalhoun 7 hours ago

    "I made a camera from an optical mouse. 30x30 pixels in 64 glorious shades of gray!"

    I wonder why so many shades of grey? Fancy!

    (Yeah, the U.K. spelling of "grey" looks more "gray" to these American eyes.)

    Hilarious too that this article is on Petapixel. (Centipixel?)

  • jan_Sate 9 hours ago

    Impressive. That's what I read HN for!

  • foxglacier 12 hours ago

    I have to say, the Game Boy Camera doesn't have only 4 colors. It has an analog output you can connect to your own ADC with more bits to get more shades of gray. I even managed to get color pictures out of it by swapping color filters and combining the images.
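
    The combining step is as simple as it sounds; a quick sketch of it (the capture arrays here are hypothetical, standing in for three ADC readouts taken through red, green and blue filters):

      import numpy as np

      def combine(red, green, blue):
          # Each input: a 2-D grayscale array from one filtered exposure, scaled 0..1.
          return np.stack([red, green, blue], axis=-1)   # H x W x 3 RGB image

      # rgb = combine(capture_r, capture_g, capture_b)   # captures are hypothetical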

  • eugene3306 8 hours ago

    Just in case the author is here: what's the FPS?

    • madars 5 hours ago

      On Reddit, the author says "The preview is shown at 20fps for a 3x scale image (90x90 pixels) and 50fps for a 1x scale image. This is due to the time it takes to read the image data from the sensor (~10ms) and the max write speed of the display.", and adds that motion tracking on this sensor goes up to 6400 fps, but you can't actually transmit images at that rate.

      https://old.reddit.com/r/electronics/comments/1olyu7r/i_made...
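
      A quick back-of-envelope check of those numbers (the display-write times below are inferred from the quoted frame rates, not stated directly):

        sensor_read = 0.010                 # ~10 ms to pull one frame off the sensor
        display_3x  = 1 / 20 - sensor_read  # ~40 ms left per frame for a 90x90 write
        display_1x  = 1 / 50 - sensor_read  # ~10 ms left per frame for a 30x30 write
        # So at 3x scale the display write dominates the frame time, while at 1x
        # the sensor readout and the display write are roughly even.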

  • SwtCyber 6 hours ago

    What I love most is that it takes something we all interact with every day and turns it into something completely different.

  • ck2 2 hours ago

    Vaguely related, but exponentially more impressive:

    A camera the size of a grain of rice with 320x320 resolution.

    https://ams-osram.com/products/sensor-solutions/cmos-image-s...