Why Webcams Aren't Good Enough (reincubate.com) 118

Jeff Carlson writes in a post: After consulting numerous webcam buying guides and reviews, purchasing a handful of the most popular models, and testing them in varying lighting situations, I can't escape the grim truth: there are no good webcams. Even webcams recommended by reputable outlets produce poor quality imagery -- a significant failing, given that's the one job they're supposed to do. Uneven color. Blown highlights. Smudgy detail, especially in low light. Any affordable webcam (even at the high end of affordability, $100+) uses inadequate and typically years-old hardware backed by mediocre software that literally makes you look bad. You might not notice this if you're using video software that makes your own image small, but it will be obvious to other people on the call. [...] Why are webcams like this?

[...] Two main factors currently hinder serious webcam innovations, one a technical limitation and one a business shortcoming. As with all photography, the way to create better images is to capture more light, and the method of capturing more light is to use larger image sensors and larger lenses. That's why a consumer DSLR or mirrorless camera produces much better images than a webcam. Primarily this is about size: webcams are designed as small devices that need to fit onto existing monitors or laptop lids, so they use small camera modules with tiny image sensors. These modules have been good enough for years, generating accolades, so there's little incentive to change. The StreamCam appears to have a better camera and sensor, with an aperture of f/2.0; aperture isn't listed for the other cameras.

Contrast this technology with the iPhone, which also includes small camera modules by necessity to fit them into a phone form factor. Apple includes better components, but just as important, dedicates hardware and software solely to the task of creating images. When you're taking a photo or video with an iOS device, it's processing the raw data and outputting an edited version of the scene. Originally, Logitech's higher-end webcams, such as the C920, also included dedicated MPEG processing hardware to encode the video signal, but the company removed it at some point. It justified the change by pointing to the power of modern computers, stating, "there is no longer a need for in-camera encoding in today's computers," but that just shifts the processing burden to the computer's CPU, which must process raw video instead of an already-compressed stream. It's equally likely Logitech made the change to reduce component costs and no longer pay to license the H.264 codec from MPEG LA, the group that owns MPEG patents. That brings us to the other factor keeping webcam innovation restrained: manufacturers aren't as invested in what has been a low-margin business catering to a relatively small niche of customers.

  • by lessSockMorePuppet ( 6778792 ) on Tuesday February 02, 2021 @03:36PM (#61020418) Homepage

    Because they don't relay all your audio and video to the authorities, like a certain other brand of spyware device.

  • use a real camera (Score:5, Insightful)

    by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Tuesday February 02, 2021 @03:38PM (#61020430) Homepage Journal

    Major manufacturers have released software to let you use most of their "real" digital cameras as webcams. You can even do it under Linux with gphoto2. So just use a real camera like the "pros" (of YouTube) do. You can get a supported used model that will do 1080p30 for around $200, and 1080p60 won't cost you much more.

    • Re:use a real camera (Score:5, Informative)

      by dgatwood ( 11270 ) on Tuesday February 02, 2021 @03:45PM (#61020472) Homepage Journal

      Major manufacturers have released software to let you use most of their "real" digital cameras as webcams. You can even do it under Linux with gphoto2. So just use a real camera like the "pros" (of YouTube) do. You can get a supported used model that will do 1080p30 for around $200, and 1080p60 won't cost you much more.

      The "pros" use a real camera, but they usually record using the camera's internal codec or using an external HDMI input dongle so that they get actual high quality video. The Canon webcam feature, for example, provides barely-above-PAL-SD resolution (1024x576).

      • The Canon webcam feature, for example, provides barely-above-PAL-SD resolution (1024x576).

        That's true (probably... I didn't measure it, but it's somewhere below 720p), but it's more than enough to get most of the benefit and look an order of magnitude better than any webcam, thanks to the properly exposed image, natural bokeh, and a flattering focal length.

        And that's Canon; others might offer higher resolutions, and of course there's always the HDMI capture option, which isn't a big leap if you're already setting up a mirrorless or DSLR for your conf calls.
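        To put those resolutions in perspective, here is the pixel-count arithmetic as a quick back-of-the-envelope sketch (figures taken from this thread, not measured):

        ```python
        # Pixel counts for the resolutions discussed above.
        resolutions = {
            "PAL SD": (720, 576),
            "Canon webcam utility": (1024, 576),
            "720p": (1280, 720),
            "1080p": (1920, 1080),
        }
        for name, (w, h) in resolutions.items():
            print(f"{name:>20}: {w * h:,} px")
        # 1024x576 has ~42% more pixels than PAL SD, but only 64% of 720p
        # and ~28% of 1080p.
        ```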

        • by Potor ( 658520 )

          That's true (probably... I didn't measure it, but it's somewhere below 720p), but it's more than enough to get most of the benefit and look an order of magnitude better than any webcam, thanks to the properly exposed image, natural bokeh, and a flattering focal length.

          I use a Canon 6D with the Canon Webcam Utility (beta), and the difference between that setup and my webcam is night and day, for all the reasons you state. Plus: I can set the white balance however I want; I can play with the shutter speed, aperture, and ISO to get the exposure I want; and I can swap lenses to get whatever look I want. Plus, the software is free, and I've had my camera for years, so there's no additional cost.

          No brainer.

        • by K10W ( 1705114 )
          Possibly depends on the camera, as my R outputs 720p (30p, 10 MB/s) via the Canon webcam software. I agree there's still a point to doing that, since it looks fine for webcam-style use vs. the webcam I have around somewhere, and the better noise and low-light performance, out-of-camera colour, control over exposure, better DR and so on mean it's still not a bad idea for just the odd call to family where video is needed (oft just audio is good enough for me and I do have significantly better than average setup fo
      • by goombah99 ( 560566 ) on Tuesday February 02, 2021 @05:36PM (#61020982)

        Bigger lens does not mean better image. In fact the reverse is true: the larger the lens, the more spherical aberration, so it requires more correction. Small lenses are easy to make with low aberrations; larger lenses are more expensive to correct.
        The reason we use large lenses is to gather more light, allowing a faster shutter speed, or in low light, even being able to get a decent amount of light at all at the slowest tolerable shutter speed.
        But once a lens is large enough to collect the required light, which webcam lenses are since they work at all, making them bigger doesn't get a better image.
        Moreover, the article says the problem is uniformity and color, and that again has nothing to do with lens size.
        The next thing they claim is that offloading raw-to-stream conversion is bad. Why? If the CPU can do it fast enough, that's 100% of what's required. It can't get any better once the CPU is fast enough. So there's no better image quality to be had by processing it on the camera.

        So this whole article seems flawed. At best if they wanted to make some sense they might have said that it's not a high enough frame rate or the images are blurry due to motion during the exposure. Those would be solvable by bigger lenses and specialized processing. But that's not the argument being made.

        • by Cid Highwind ( 9258 ) on Tuesday February 02, 2021 @06:48PM (#61021312) Homepage

          But once a lens is large enough to collect the required light, which webcam lenses are since they work at all, making them bigger doesn't get a better image.

          Camera sensors operate below the "digital stuff works or it doesn't" level of abstraction. Bigger lens = more photons per pixel = more dynamic range in the analog stages = more information and less noise for the digital stages to work with = better picture.
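          The photon argument can be made concrete with shot noise: in a shot-noise-limited pixel the noise is the square root of the photon count, so SNR = sqrt(N). A small sketch with illustrative (made-up) photon counts:

          ```python
          import math

          # Shot-noise-limited SNR: N photons arrive, noise is sqrt(N),
          # so SNR = N / sqrt(N) = sqrt(N). Quadrupling the light-gathering
          # area quadruples N and doubles the SNR.
          def snr(photons: float) -> float:
              return photons / math.sqrt(photons)

          for n in (100, 400, 1600):  # illustrative photons-per-pixel counts
              print(f"{n:>5} photons -> SNR {snr(n):.0f}")
          ```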

          • Bigger lens means shallower depth of field. Unless you have a lens with adjustable focus, and you adjust that focus as needed, that bigger lens you want is going to be giving you fuzzy images more often than not.
            • by Cederic ( 9623 )

              You make this sound like an issue.

              When on a video conference I am able to keep my head within a perfectly reasonable distance range from the camera, making even manual focus a viable option.

            • Which is why even some cheap $40 webcams have autofocus - to cope with the larger lenses required for low-light conditions.

            • We're talking about higher end cameras here, they have focus.

              I believe that with a Canon with Magic Lantern installed you can autofocus in video if you use a remote (the wired one I bought to use with my original Digital Rebel works with my T2i) and hold the trigger down halfway. The newest cameras might even have a mode to do it automatically.

        • It's not a bigger lens that's needed and that the article is arguing for, but a bigger sensor. A bigger sensor has more space to capture light, making it more sensitive to detail and less sensitive to noise. The smaller the sensor, the more likely it is to suffer image noise and reduced detail, regardless of how many pixels it is supposed to generate.

          • by K10W ( 1705114 )

            It's not a bigger lens that's needed and that the article is arguing for, but a bigger sensor. A bigger sensor has more space to capture light, making it more sensitive to detail and less sensitive to noise. The smaller the sensor, the more likely it is to suffer image noise and reduced detail, regardless of how many pixels it is supposed to generate.

            And bigger sensors cost significantly more. It's not as if doubling sensor size simply doubles the cost; it doesn't scale linearly, and twice the surface area can be four times the price. Then come all the other costs of increasing the sensor, such as bigger optics so the glass isn't your significant weak link. For webcams, the quality is simply good enough for their main purpose, and this whole thing is a non-issue. One of the roles of webcams utilised in the correct way is low cost, ie

        • The next thing they claim is that offloading raw-to-stream conversion is bad. Why? if the CPU can do it fast enough that's 100% what's required.

          It's far dumber than that. Nearly every modern PC has either one or two hardware CODECs already. Intel Quicksync Video or AMD's Video Core Next (or VCE for older AMD chips), or for those people with dedicated GPUs they have NVENC or VCN on top of their existing Quicksync.

          There is literally dedicated hardware already available for this purpose in the PC.

          • by dgatwood ( 11270 )

            The main problem I'd expect from doing too much in software is the overhead involved in moving that much bulk data. For 1080p at 60fps and three bytes per pixel, you're moving 3Gbps over USB. That's seriously CPU-intensive — likely an order of magnitude more intensive than pulling 20 Mbps MPEG data over the wire and using a hardware CODEC in the computer to decode it.
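            The parent's figures check out; here's the arithmetic spelled out (3 bytes/pixel assumes packed 8-bit RGB):

            ```python
            # Uncompressed 1080p60 vs a typical compressed stream.
            w, h, bytes_per_px, fps = 1920, 1080, 3, 60
            raw_bps = w * h * bytes_per_px * fps * 8        # bits per second
            print(f"raw 1080p60: {raw_bps / 1e9:.2f} Gbps")  # ~2.99 Gbps
            h264_bps = 20e6                # the 20 Mbps MPEG figure above
            print(f"raw is {raw_bps / h264_bps:.0f}x the compressed bitrate")
            ```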

        • by AcquaCow ( 56720 )

          If you want to increase signal-to-noise ratio, you need more photosites, meaning you need a bigger sensor. If you get a bigger sensor, you're going to need a larger lens. The larger lens also lets in more light, which gets you faster shutter speeds and a crisper image. An added bonus of a larger sensor and larger lens is that you can get shallower depth of field to isolate the subject on camera.

      • Fair enough. I see you can get 1080p30 to USB2 UVC for $9, though a cable does cost another five bucks or so.

    • by cayenne8 ( 626475 ) on Tuesday February 02, 2021 @03:52PM (#61020508) Homepage Journal
      One thing he briefly hit on, but which is VERY important in the whole setup... is to spend time on proper LIGHTING!!

      If you can properly light yourself, your scene..that will help your picture a great deal.

      Make sure to color temperature match your lights...and set them up.

      3-point lighting isn't hard, and it is a tried and true set up that works for everyone.

      They also didn't mention much about audio.... with any video, you'll find that even if your picture isn't the best quality, if your audio is quality, people will still watch you.

      The reverse isn't true... you could have a pristine 8K image, but if your audio is crackly, wind-noisy, etc., people will shut you off almost immediately... spend time setting up a decent mic arrangement.

      Of course a better camera and lens will help, but until you can afford that, work on your lighting first, and that will help ANY camera system you settle on.

      Professional productions spend a lot of time on lighting...for a reason.

      YouTube is filled with lighting HowTo's.

      • by Cederic ( 9623 )

        I just have a nice angle lamp that bounces light off two walls and the ceiling. The result is soft lighting that still looks fine, and doesn't require me to set up multiple lights in a small room.

        If it's light outside there's also light from the window, but that isn't really needed.

    • by AmiMoJo ( 196126 )

      You shouldn't have to spend that much though. The relatively cheap cameras on phones do a much better job, and so do their microphones.

      So given they only need the camera and mic and not the whole phone, it shouldn't be that hard to make a webcam that produces the same quality using your PC for processing....

      But it is, because of the way USB webcams work, there isn't really any way to use the PC for processing, except via drivers that are annoying for the user and often janky. They are not cheap to make it maint

    • > So just use a real camera like the "pros" (of YouTube) do.

      Tried that, and my camera overheated.

      However, any camcorder that has an HDMI output can be used. Although technically not quite as high quality as modern pro cameras, they can all zoom a decent distance, so you can set one up five feet away, and the longer focal length gives a much more pleasing picture than wide-view webcams. They can be bought on eBay for $30 or so.

      Given that camcorders are designed to work for hours at a time, overheating is

  • by jellomizer ( 103300 ) on Tuesday February 02, 2021 @03:39PM (#61020434)

    Webcams are designed to be cheap. And why not, given that most video will be compressed and sent over a network at under 100 Mbps.
    These are not production video cameras that cost hundreds or thousands of dollars; they are about $10 worth of electronics, good enough to give an OK representation of you over low bandwidth.

    • by GoRK ( 10018 )

      I'd buy your statement if webcams were sold for the $5 that is commensurate with their quality. Otherwise, please explain why you can get a no-name $40 camcorder and a $9 HDMI-to-USB capture dongle from eBay and achieve fantastically better quality than a $400 webcam. How much the picture is compressed is irrelevant -- garbage in, garbage out, as they say.

    • Cheap is what drives webcams. The Logitech QuickCam 4000 Pro is a famous example. I bought that webcam in 2004 or 2005. It was by far the best webcam I had tried; although just 640x480, the picture was crystal clear with no noise even in low light. I thought it was well worth the money (something like $100 then). When it was stolen probably a year later, I did not blink twice and reordered it. When I tried it out, I was instantly dismayed. It was just another run-of-the-mill webcam with the classic noisy outpu

  • by fish_in_the_c ( 577259 ) on Tuesday February 02, 2021 @03:41PM (#61020450)

    If there were sufficient demand for something else, it would be made. Most people have no use for high-end video equipment; talking to grandma just doesn't justify the expense.

    • by BAReFO0t ( 6240524 ) on Tuesday February 02, 2021 @03:51PM (#61020502)

      You mean like there is sufficient demand for phones with a large battery? Or for a phone that isn't deliberately, literally made of glass? Or demand for affordable high-quality products?

      Sorry, the industry knows it can strong-arm you all the way to their bank account.

      • "You mean like there is sufficient demand for phones with a large battery? Or for a phone that isn't deliberately, literally made of glass? Or demand for affordable high-quality products?"

        YES. The five or ten remaining tech-literate consumers are a niche market.
        Bubba and LaQueefa don't genuinely care about anything you identified. Sorry. The average human is not much smarter than our primate cousins. Never mistake grumbling for determination to pay for what people SAY they want.

      • Don't be ignorant. Phones have a large battery, but people have demanded phones that are faster and have better screens and more capability. Same with the glass: there are phones on the market that make you look rich and break as soon as you look at them. If that's you, then go for it. There are phones on the market you can happily throw through basketball hoops; if that's what you want, fucking buy that.

        Don't pretend the manufacturers are pushing something that buyers aren't demanding. The market is open. E

      • I think it's more accurate to say that the mainstream are perfectly happy with what options they have. It's the minority, like you and me, that are being strong-armed by default.

        I miss when computers were tools, not toys. There's plenty of affordable quality products -- just not ones the Slashdot community want.

        • by tlhIngan ( 30335 )

          I think it's more accurate to say that the mainstream are perfectly happy with what options they have. It's the minority, like you and me, that are being strong-armed by default.

          I miss when computers were tools, not toys. There's plenty of affordable quality products -- just not ones the Slashdot community want.

          No, if you want it, people do make it. There are phones with gigantic batteries available. There are phones built to be rugged. Problem is, they sell so few of them that they cost a lot more - you ca

    • I've found video conferencing outside a business context to be more of a novelty. It really didn't add much more than "god, you've gotten old" to the conversation.

    • by Anonymous Coward

      If there were sufficient demand for something else, it would be made.

      Maybe actual devices need to exist before manufacturers can have sales data that shows whether people want them.
      I've looked for good webcams; they still don't exist. Instead of webcams, people are using clunky solutions like HDMI capture devices to ingest a DSLR's live output. Another way people get decent-quality video is by installing a webcam app on a smartphone and sending the video feed to a PC over wifi.

    • As has already been pointed out, you can record with a DSLR. And there is some *very* nice dedicated video conferencing equipment out there. Those aren't "webcams." But they are really impressive.
    • by Pimpy ( 143938 )

      People like you used to argue there was no demand for providing more than 1 MP camera in a phone either, or that 640k of RAM should be enough for anybody. Manufacturers are bound by economies of scale, at a certain point it no longer makes financial sense to produce low-grade parts and they move on. The steady growth in CCD MP ratings YoY is a good example of this. It's not like they're making any more money, and the margins haven't drastically changed, they'd just do even worse by sourcing outdated compone

    • also... a low quality camera might even be an advantage when video chatting with grandma
  • Tip: I use my smartphone as a webcam, connected through USB to the PC. There are apps available that will present your phone as a webcam, 4K ready. Connect more than one phone and switch between camera viewpoints to spice up a webconf: one aimed at a whiteboard, the other at your face.
  • All cameras do poorly with poor light. If you want good picture quality, the easiest fix is typically to improve the light.

  • Logitech dropped MPEG compression? I'm not sure; at the very least you need some kind of compression if you want more than 640x480 at 30fps, because the bandwidth of USB 2.0 is quite limited.

    And I have a cheap Chinese 1080p USB camera that supports MPEG streaming.
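    For the curious, the USB 2.0 arithmetic (assuming the common uncompressed YUY2 format at 2 bytes/pixel; actual attainable throughput is well below the 480 Mbps signalling rate due to protocol overhead):

    ```python
    # How much bandwidth do uncompressed YUY2 (2 bytes/pixel) modes need?
    def mbps(w, h, fps, bytes_per_px=2):
        return w * h * bytes_per_px * fps * 8 / 1e6

    for w, h in [(640, 480), (1280, 720), (1920, 1080)]:
        print(f"{w}x{h}@30: {mbps(w, h, 30):.0f} Mbps")
    # 640x480@30 (~147 Mbps) fits; 1280x720@30 (~442 Mbps) is marginal;
    # 1920x1080@30 (~995 Mbps) is impossible without compression.
    ```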

  • by BAReFO0t ( 6240524 ) on Tuesday February 02, 2021 @03:47PM (#61020486)

    Stick your phone to the top of your screen, mount your phone's file system to your PC's, and pipe your phone camera's /dev/ file into your chat tool's video input.

    Tadaa! Top quality webcam.

    Oh wait! We can't!
    Because Apple and the media mafia ruined it!

    • by gantzm ( 212617 ) on Tuesday February 02, 2021 @05:46PM (#61021026)
      Use this! [manycam.com]
    • It's really hacky, but this script I developed [aweirdimagination.net] does that by piping the phone's camera through a WebRTC connection made by using the mobile web browser, which connects to the desktop using gstreamer. It's pretty fiddly and doesn't actually work very well in practice, but it's at least a proof-of-concept that it's possible, without even needing to install an app on your phone past the web browser it already has.
    • by croftd ( 7659904 )
      The r2do app (Android only) can use most existing devices to capture images on a scheduled interval.
    • by xsuchy ( 963813 )

      I use DroidCam. https://www.dev47apps.com/ [dev47apps.com] It turns any phone into a webcam. I have Android, but it should work with iPhone too. The free version is capable of SD, paid version can do HD resolution. Over Wifi or USB cable.

      Phone cameras are very good now.
      I just bought a stand for a phone for $20 and I have a setup with very fast autofocus, wide lens, and can accommodate almost any light.

  • Global Shutter 720p (Score:5, Informative)

    by Kisai ( 213879 ) on Tuesday February 02, 2021 @04:03PM (#61020568)

    The problem is that most webcams are cheapest of the cheap.

    - No global shutter, and the best "global shutter" cameras you can get top out at 720p unless you're willing to spend several thousands of dollars on a studio camera.

    The higher the resolution the more apparent the rolling shutter effect (jello/warping when movement happens) is. When you take still pictures, that's usually solved with software in the camera to take multiple exposures as part of taking a "HDR" image. Video doesn't have that option.

    - No dedicated hardware ASIC's

    As pointed out above, this is also a key reason why webcams are nowhere near the quality of a DSLR. The parts in webcams are the same parts in computers and low-end phones. You could pretty much guarantee that any USB webcam is rubbish if it doesn't have a USB-C cable on it, because to capture 4K video you need the bandwidth of 3840x2160x4x60 bytes (2GB) per second, which would require a 15Gbit connection, or three times USB 3.0. Existing cameras cheat: they either cut the frame rate, or they use 4:2:0 YUV colorspace instead of RGB. For 4:2:0 you multiply by 3/2 instead of 4 (4 bytes being 8-bit RGB plus padding), which gives you 6Gbits, just a little too much for USB 3.0 (which is 5Gbits).

    720p, the rubbish camera in everything, only requires 1280x720x4x60 bytes (221 MB/s or 1.8 Gbit/s) at most, fitting under USB 3.0 but not 2.0, and is more likely 4:2:0, using 83 MB/s (664 Mbit/s), still too high for USB 2.0. I'll point out now that most 720p cameras built into laptops are actually USB 2.0 devices, and usually 720p30 or 480p60 only.
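    The arithmetic above can be reproduced directly (4 bytes/pixel for RGB with padding, 1.5 bytes/pixel for 4:2:0):

    ```python
    # Uncompressed video bandwidth for the modes discussed above.
    def gbit_per_s(w, h, bytes_per_px, fps):
        return w * h * bytes_per_px * fps * 8 / 1e9

    print(f"4K RGBX 60fps:    {gbit_per_s(3840, 2160, 4, 60):.1f} Gbps")    # ~15.9
    print(f"4K 4:2:0 60fps:   {gbit_per_s(3840, 2160, 1.5, 60):.1f} Gbps")  # ~6.0
    print(f"720p 4:2:0 60fps: {gbit_per_s(1280, 720, 1.5, 60):.2f} Gbps")   # ~0.66
    # USB 3.0's 5 Gbps can't quite carry 4K 4:2:0 at 60fps uncompressed.
    ```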

    So why isn't compression invoked? Why are we using crappy rolling-shutter cameras?
    Latency, mainly. Global shutter requires the entire frame to be read out at once rather than one line at a time, so if you compress it and take more than 8ms, you'll induce enough latency to be noticed by someone using it for real-time communication. Since people in front of webcams mostly only move their mouths, you don't notice the jello effect unless they are very animated (e.g. a child), or the camera is used to monitor something like a street or doorbell.

    Basically, webcams are garbage because they are meant to be garbage. If you want high-quality video, you buy a DSLR with a large sensor and multi-thousand-dollar lenses, not an iPhone. What you get in camera phones, at least in the front cameras, is "good enough to take cat videos", and they still do extremely poorly in low-light and no-light situations. An iPhone can be used to make a "found-footage" video with its built-in flash/light, and that's roughly what you get. If you tried to film a city at night with an iPhone, or any cameraphone for that matter, it would be mostly black/orange and blurry.

    • by Tester ( 591 )

      So why isn't there compression invoked?

      But there is compression. Almost all webcams on the market actually output JPEG images for anything but the lowest resolution/framerate. That said, JPEG, even though it's an antique codec, isn't the problem. The problem is the sensor & ISP, which are very, very cheap.

    • "Existing cameras cheat, they either cut the frame rate, or they use 4:2:0 YUV colorspace instead of RGB. For 4:2:0 you instead multiply by 3/2 instead of 4 (which is 8-bit RGB) which gives you 6Gbits"

      It's not really cheating since the sensor usually has a Bayer layout: every pixel is either R, G, or B. So, the image processing first multiplies the number of bits by 3 by interpolating the missing RGB components, and then eliminates half of the bits in the conversion to YUV420.
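      The bit accounting for that pipeline is straightforward (8-bit samples assumed throughout):

      ```python
      # Bits per pixel at each stage of the Bayer -> RGB -> YUV420 pipeline.
      bayer = 8                    # one 8-bit sample (R, G, or B) per photosite
      rgb = bayer * 3              # demosaicing interpolates the 2 missing channels
      yuv420 = 8 + 8 / 4 + 8 / 4   # full-res luma; chroma subsampled 2x2
      print(bayer, rgb, yuv420)    # 8 24 12.0
      ```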

    • My $50 HP webcam freezes hummingbird wings. Who needs a mechanical shutter?
  • You never know which laptops will have good webcams. I am using a Dell Precision 7750, which has 64GB of ECC memory and a high-end Xeon, but the webcam looks like something from the 1990s. It claims 720p, but it is the noisiest camera I've ever seen. I normally wouldn't care, but I do a lot of face-to-face meetings and it's kinda embarrassing.

  • If OP's benchmark is a $100 webcam, that is the problem, and it's misleading. I have a Logitech Brio, a $300 CAD 4K USB 3.0 webcam, attached to my Dell laptop. The webcam built into the laptop is worse than the camera built into any of my smartphones from the last 10 years.

    My Brio is amazing. Colour reproduction is amazing. I have a backlit office, and the camera can easily see me with stellar colour reproduction. As I'm typing this, my face has about 110lux of light on it, and has a 1200 lux backlight from my of

    • I use the Brio as well for work and love it but, in all fairness, it is the only decent quality webcam available in the market today.

      There's no comparable product by any other company - and the price reflects this.

      • by Venner ( 59051 )

        +1 for the Brio. I've had one a couple of years and it's solid, and the only real package option I've seen. It was also sold out for months at the start of the pandemic.

        I have an Azure Kinect cam coming as part of a project that I'm hoping I can use in place of the Brio, so I can replace my parents' awful webcam, but the Microsoft cam isn't exactly budget either. I may end up just building a desk mount for my mirrorless camera; as someone above mentioned, many of the camera manufacturers have enable webcam-m

    • But that $99CAD webcam will have no better image quality than the back camera on a $99CAD Android phone with all its radios, battery, accelerometer, GPS, 2x cameras, and screen. Seems like someone could do better.

  • Logitech's higher-end webcams, such as the C920, also included dedicated MPEG processing hardware to encode the video signal, but removed it at some point. The company justified the change because of the power of modern computers, [...] but that just shifts the processing burden to the computer's CPU, which must process raw video instead of an optimized stream. It's equally likely Logitech made the change to reduce component costs and no longer pay to license the H.264 codec from MPEG LA, the group that own

  • by MobyDisk ( 75490 ) on Tuesday February 02, 2021 @04:13PM (#61020628) Homepage

    The article has it backwards regarding Logitech removing the H.264 encoding from the camera. The reason old cameras compressed the stream was that old USB 1 devices didn't have enough bandwidth for a raw video stream. It was not to save CPU cycles; actually, it made CPU usage worse. Rarely would software use the compressed H.264 feed from the camera directly. Usually it would have to decompress it to display it. And most of the time it will decode the stream, apply some effects (removing a background, overlaying it on the video game, adjusting the audio, etc.), then re-encode it in whatever format is appropriate for the stream or for disk. Sending the raw uncompressed feed to the processor is better, since it removes the need for the software to decode the stream, and the raw video is better quality.

    • by Tester ( 591 )

      In practice, no one used the H.264 stream from the camera. Everyone uses the JPEG output. The H.264 stream was embedded in a weird format inside the JPEG comments... This was a spec created by Logitech+Microsoft+Skype, and as far as I know, only the old version of Skype and GStreamer could ever use it. And since the old Skype has been replaced, no one is actually using it! So everyone only used the JPEG content, which is more than good enough. The H.264 was never about the USB bandwidth, it was always about

  • Step up to a camera designed for your requirements.
  • Reading cayenne8's comments about the importance of lighting, and having seen so many of TV's "Talking Heads" being interviewed from home with bright desk and table lamps in the background, I have to wonder about the risk of sensor burn-in.

    I know that pretty much all modern DSLRs and mirrorless cameras have video record capability and that most can run for a good length of time [usually until the camera gets too hot], but in most cases the subject of the recording is going to be moving, or the camera w
  • by Java Pimp ( 98454 ) on Tuesday February 02, 2021 @04:29PM (#61020704) Homepage

    Web cams used to come with software that let you adjust the quality. Color, saturation, brightness/contrast, white balance... It was never the best quality but at least you had some level of control. Now they try to be smart and do it all for you and if you don't like it... oh well...

    • by v1 ( 525388 )

      I still have two original Apple iSight cameras. They were quite expensive, but were amazing for the time. Someone wrote software to fine-tune them, and they were so good you could use them for night vision. Had full access to white balance, exposure, and all the goodies. Interface was firewire400. Too bad time has left them behind, they were only like 480x320 resolution. The mic was stereo and very high sample rate. Had a physical shutter, when you rotated the end you could see the white petals fold

    • Maybe of help for Linux users wanting to fine-tune webcam settings: there is a nice tool that comes with v4l-utils called qv4l2. Depending on what the camera supports, you can fine-tune things like exposure.
      My crappy old Logitech webcam has a horrible frame rate if automatic exposure is enabled with indoor lighting, but setting it to manual and reducing the exposure makes it usable.

  • by QuietLagoon ( 813062 ) on Tuesday February 02, 2021 @04:30PM (#61020706)
    ... a thinly disguised ad for Apple's iPhone.
  • by SB5407 ( 4372273 ) on Tuesday February 02, 2021 @04:36PM (#61020732)
    Just a heads-up: the submitter/OP -- Jeff Carlson -- works for a company that sells software to turn iPhones into webcams, so there's a possible conflict of interest when it comes to reviewing webcams, you know?

    The software: https://reincubate.com/camo/ [reincubate.com]

    About OP, a Reincubate employee: https://reincubate.com/about/ [reincubate.com]
    • by jeffcarlson ( 7688268 ) on Tuesday February 02, 2021 @04:58PM (#61020812)
      Two quick clarifications, for transparency: 1. I (Jeff Carlson) wasn't the submitter of the post; that was "msmash". 2. I was hired by Reincubate to write the article (my third for them), but I'm not an employee. I'm a freelance writer who covers technology and photography. The topic came about during discussions with the Reincubate folks about webcams and image quality in general. So, although the conclusions favor the company's product, the research I did stands on its own. You can see more about me at jeffcarlson.com.
      • by SB5407 ( 4372273 )
        Seeing your name under the section titled "Our Experts" on Reincubate's "About our company" page (https://reincubate.com/about/) had understandably led me to believe that you were an employee of Reincubate. Thank you for the response.
        • Oh yeah, I forgot that was there. Cheers.
        • Hey, politeness has no place on social media! We want some snarls here, or at the very least entertainment.

          That goes for you, too, Jeff Carlson. Stop being so nice. :)

  • I work at Facebook. They encouraged us to use a Facebook Portal for meetings. Its video quality is superb, even in low light. The webcam built into my MacBook is decent. I'd read good reviews of the Microsoft LifeCam, and bought their "LifeCam Studio for Business", and its quality is awful.

  • I don't really want to spend a lot of time or energy on a "quality" webcam.

    I'm not posting videos on rubetube or twatter, or camwhoring.

    You really don't want enough resolution to see my wrinkles/pores.

  • A lot of people are either using what is in their laptop, or whatever was cheap and available. The laptop camera is at a terrible angle. Even if it had a good camera, it would still look like garbage.

    There are good cameras; yes, they cost more. The camera should be slightly above eye level, but not by too much. The same goes for the microphone: notice that real broadcasters put their microphones up high.

    It isn't that hard to put out a good picture during meetings. That has the added impact of making you seem more authoritativ
    • The microphone up high is to get good sound without the microphone being in the frame. It's easier to get good sound if you don't mind the microphone being visible, as it is in most podcasts, where you're likely to see a Blue Yeti sitting right in front of the podcaster, or a Shure MV7 or a large diaphragm condenser mic on a visible boom. Or if the podcast takes live calls you might see them wearing a headset.
  • Excuses! (Score:4, Insightful)

    by BrendaEM ( 871664 ) on Tuesday February 02, 2021 @04:52PM (#61020796) Homepage
    We have USB 3. We have VP8. Fix it!
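[Editor's aside] The bandwidth math behind that complaint is easy to check. A back-of-envelope sketch (the USB throughput figures are rough assumptions): uncompressed 1080p overflows USB 2.0, which is one reason cheap cameras compress on-board, but it fits easily in USB 3, where a modern host-side codec like VP8 could take over.

```python
def raw_video_mbps(width, height, fps, bits_per_pixel=16):
    """Uncompressed bitrate in megabits/s; 16 bpp assumes a YUYV 4:2:2 feed."""
    return width * height * bits_per_pixel * fps / 1e6

# Rough bus figures (assumptions: real-world USB 2.0 tops out well below
# its 480 Mbps signaling rate; USB 3.x Gen 1 signals at 5 Gbps).
USB2_PRACTICAL_MBPS = 320
USB3_GEN1_MBPS = 5000

rate_1080p30 = raw_video_mbps(1920, 1080, 30)  # ~995 Mbps: no fit on USB 2.0
rate_1080p60 = raw_video_mbps(1920, 1080, 60)  # ~1990 Mbps: fits on USB 3
```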
  • Seriously, who cares? It's a webcam.

    My company literally gave everyone in the company a webcam to use in 2012 or so, to help make Webex more personable or some such thing. It was a fad for a week or two, then we defaulted to not using webcams for Webex again. Some folks are starting to use webcams a bit more often because a lot of us are working from home, but for the most part, seeing each other doesn't contribute to the data we're looking at being presented.

    And if we do see each other, it's in a little fl

    • Camgirls, live streamers, and the like

      The only reason that I see that a larger aperture would be important is if it means that you don't need a good lighting rig, as I hate it when friends FaceTime me when I'm getting ready to go to sleep, and I have to turn on some lights so they can actually see me. (a friend 3 timezones away does it so I can distract her son while she cooks dinner)

      But overall, my take on the article was basically that "webcams" went the same ways as modems -- they stripped down the hard

  • I have a Logitech C920 that I use on my desktop and in place of my built-in laptop webcam. It's an $80 1080p webcam and does surprisingly good video and decent stereo audio for something that is already a few years old. I find it sufficient for everything, though I might not recommend trying to produce professional-quality videos to monetize on the likes of YouTube. It does alright in low-light situations as well.

    Way cheaper than an iPhone, and it doesn't monopolize your phone while on a video call.

    • This. I have one hanging on my primary monitor, and the audio/video quality is outstanding. That model has since been replaced, and a colleague has one of the newer version, and it is also outstanding.

      I suspect that most of those that people are complaining about are just really cheap hardware.
  • One of the big issues I see is the compression and streaming. The webcam typically sends an already-compressed stream over USB. This stream is typically decompressed by a software application and then re-compressed to send over the wire. This recompression (decompress, recompress) typically happens in software, using Electron, and is coupled with screen recording and viewing other streams from the internet. Most computers only have 1 hardware encoder and 1 hardware decoder, so even if they used it, there's stil

  • Cheap solution (Score:5, Interesting)

    by Joe2020 ( 6760092 ) on Tuesday February 02, 2021 @06:18PM (#61021178)

    I went with a cheap industrial 3M-pixel 12-bit HDR camera and a cheap colour reference card to get past this dilemma.

    One can get such cameras now for less than $80 and colour reference cards for $15. Such cameras map their 12-bit input down to 8-bit and cover a 16x higher dynamic range than typical webcams, and they often include features like H.264 compression alongside raw and Motion-JPEG output. A cheap colour reference card placed in the background then helps to produce a better, more stable white balance. It's not perfect, but it produces a much better video than typical webcams and is still a cheap solution.

    Where to find these things? In China, on web sites such as AliExpress.com, or directly from the manufacturers. Just don't search for webcams, but for industrial cameras, from people who are happy to disclose the full technical specs without hesitation.
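[Editor's aside] The 12-bit-to-8-bit mapping the parent describes can be sketched in a few lines; the gamma curve below is purely illustrative, not any camera's actual algorithm:

```python
def tonemap_12_to_8(sample, gamma=2.2):
    """Map a 12-bit sensor sample (0..4095) to 8 bits via a gamma curve.

    A plain linear shift (sample >> 4) discards shadow detail; a gamma
    curve spends more of the 256 output codes on the dark end, which is
    roughly how a high-dynamic-range sensor stays usable in 8-bit output.
    """
    return round(255 * (sample / 4095.0) ** (1.0 / gamma))
```

For example, a dark sample of 256 maps to roughly 72 here, versus 16 with a plain bit shift, which is where the extra shadow detail comes from.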

    • by gTsiros ( 205624 )

      do they allow manual setting of, say, exposure duration?

      • Yes. But when this isn't enough, one can also go for a Raspberry Pi Zero, for example. Many SBCs come with a direct camera link (CSI) interface and allow one to connect a camera module and read the raw image sensor output (i.e. Bayer format) at maximum speed. Such SBCs also sometimes contain H.264 hardware encoders. One can basically build a camera to very specific needs, and only one's programming abilities set the limit. And it's still low cost.
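[Editor's aside] Reading the raw sensor means demosaicing the Bayer mosaic yourself. A naive half-resolution sketch in Python (an RGGB tile layout is assumed; real pipelines interpolate to full resolution instead):

```python
def bayer_rggb_to_rgb(raw, width, height):
    """Half-resolution demosaic of an RGGB Bayer frame (a naive sketch).

    `raw` is a flat, row-major list of sensor samples. Each 2x2 tile
    [R G / G B] becomes one RGB pixel; the two greens are averaged.
    """
    out = []
    for y in range(0, height, 2):
        row = []
        for x in range(0, width, 2):
            r = raw[y * width + x]
            g1 = raw[y * width + x + 1]
            g2 = raw[(y + 1) * width + x]
            b = raw[(y + 1) * width + x + 1]
            row.append((r, (g1 + g2) / 2, b))
        out.append(row)
    return out
```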

  • Did no one proofread it? Also, some GPUs have encode/decode hardware and all modern non-embedded CPUs have had vector instructions for a long time. Dedicated hardware is not always better or faster.

  • 'Nuf said. It's the only way I Zoom from my laptop when WFH.

  • A cheap action cam is way better than most dedicated webcams, but at $40-$50 it is also more expensive than many webcams.

    So yes, $50 should give you a very good 1080p60 webcam with a good sensor and chipset.

  • Just buy whatever hotchick69 uses

  • "manufacturers aren't as invested in what has been a low margin business catering to a relatively small niche of customers."

    At first glance, the differences between a webcam and a laptop camera may seem like comparing chalk and cheese, but hear me out: the quality of even the best laptop camera on the market is basically on par with the best webcam on the market.

    If the fundamental argument for webcam video quality being so low was that webcams were a niche market, then it stands to reason that

  • I deliver >40 professional development courses per year. Since the pandemic hit, most of those have been online. I need all of the help that I can get to keep my students interested and engaged, so I use what I know from my side gig as a photographer. Good lighting is a must, but it need not be expensive. Camera placement is also important, so I use a short tripod on my desk to get the camera up to eye level, well above screen level for my laptops (yes, I use two.) I also use a very quiet room and a stud
  • by mcswell ( 1102107 ) on Wednesday February 03, 2021 @11:44PM (#61025906)

    The *real* problem is that my webcam doesn't make me look young again. (I'm 70.)

  • In Windows 10, when I test the microphone on my USB webcams, it tells me it's getting between 2 and 5% volume; this is with the volume control turned all the way up. Surely this is inadequate? I understand that these webcams are all using the built-in Microsoft driver, and web search says this has been a problem since Windows 8; the Windows 7 drivers supported a microphone boost, but now the boost is only supported with microphones that use a pin connector. There are supposedly apps that will fix this, b
