Posts

New understanding of color perception theory

From phys.org, a news article about a recent paper that casts doubt on the traditional understanding of how human color perception works, "Math error: A new study overturns 100-year-old understanding of color perception":

A new study corrects an important error in the 3D mathematical space developed by the Nobel Prize-winning physicist Erwin Schrödinger and others, and used by scientists and industry for more than 100 years to describe how your eye distinguishes one color from another. The research has the potential to boost scientific data visualizations, improve TVs, and recalibrate the textile and paint industries.

The full paper appears in the Proceedings of the National Academy of Sciences, vol. 119, no. 18 (2022). It is titled "The non-Riemannian nature of perceptual color space," authored by Dr. Roxana Bujack and colleagues at Los Alamos National Lab. The scientific community generally agrees on the theory, introduced by Riemann and furthered by Helmholtz and S...
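For readers unfamiliar with how such a 3D color space is used in practice, here is a minimal sketch of the kind of metric the paper revisits: treating perceived color difference as a Euclidean-style distance between two points in a 3D perceptual space (here CIELAB, with the classic CIE76 ΔE*ab formula). The coordinates and function name below are illustrative and not taken from the paper.

```python
import math

def delta_e_ab(lab1, lab2):
    """Classic CIE76 color difference: Euclidean distance in CIELAB.

    This is the kind of "distance in a 3D perceptual space" that the
    traditional model assumes; the PNAS paper argues that perception
    does not behave this way for larger color differences.
    """
    dL = lab1[0] - lab2[0]
    da = lab1[1] - lab2[1]
    db = lab1[2] - lab2[2]
    return math.sqrt(dL * dL + da * da + db * db)

# Example: two mid-tone colors expressed as (L*, a*, b*) triplets
color_a = (52.0, 42.5, 20.1)
color_b = (55.0, 38.0, 25.7)
print(f"Delta E*ab = {delta_e_ab(color_a, color_b):.2f}")
```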

Direct ToF Single-Photon Imaging (IEEE TED June 2022)

The June 2022 issue of IEEE Trans. Electron Devices has an invited paper titled "Direct Time-of-Flight Single-Photon Imaging" by Istvan Gyongy et al. from the University of Edinburgh and STMicroelectronics. This is a comprehensive tutorial-style article on single-photon 3D imaging that includes a description of the image formation model, starting from first principles, and practical system design considerations such as photon budget and power requirements.

Abstract: This article provides a tutorial introduction to the direct Time-of-Flight (dToF) signal chain and typical artifacts introduced due to detector and processing electronic limitations. We outline the memory requirements of embedded histograms related to desired precision and detectability, which are often the limiting factor in the array resolution. A survey of integrated CMOS dToF arrays is provided, highlighting future prospects for further scaling through process optimization or smart embedded processing.

Full paper...
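As a rough illustration of the histogram-based dToF principle the tutorial covers, here is a minimal sketch (not taken from the paper): photon arrival timestamps are binned into a histogram, the peak bin is taken as the return time, and depth follows from d = c·t/2. The bin width and range values are arbitrary assumptions chosen for the example; note how the histogram memory grows with range divided by bin width, the trade-off the abstract refers to.

```python
import numpy as np

C = 3e8  # speed of light, m/s

def depth_from_timestamps(timestamps_s, bin_width_s=250e-12, max_range_m=10.0):
    """Estimate depth from SPAD photon arrival times via a ToF histogram.

    timestamps_s : photon arrival times relative to the laser pulse (seconds)
    bin_width_s  : histogram bin width; sets the depth quantization step
    max_range_m  : maximum unambiguous range covered by the histogram
    """
    # Number of bins (i.e., histogram memory) scales with range / precision
    n_bins = int(np.ceil((2 * max_range_m / C) / bin_width_s))
    hist, edges = np.histogram(timestamps_s, bins=n_bins,
                               range=(0.0, n_bins * bin_width_s))
    peak_bin = np.argmax(hist)                      # strongest return (background ignored here)
    t_peak = (edges[peak_bin] + edges[peak_bin + 1]) / 2
    return C * t_peak / 2                           # round-trip time -> one-way distance

# Example: a target at ~3 m plus uniform background photons
rng = np.random.default_rng(0)
signal = rng.normal(2 * 3.0 / C, 100e-12, size=200)   # returns clustered near 20 ns
background = rng.uniform(0, 2 * 10.0 / C, size=500)   # ambient photons spread over the range
print(f"Estimated depth: {depth_from_timestamps(np.concatenate([signal, background])):.2f} m")
```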

CFP: International Workshop on Image Sensors and Imaging Systems 2022

The 5th International Workshop on Image Sensors and Imaging Systems (IWISS2022) will be held in December 2022 in Japan. This workshop is co-sponsored by IISS.

- Frontiers in image sensors based on conceptual breakthroughs inspired by applications -

Date: December 12 (Mon) and 13 (Tue), 2022
Venue: Sanaru Hall, Hamamatsu Campus, Shizuoka University (access: https://www.eng.shizuoka.ac.jp/en_other/access/)
Address: 3-5-1 Johoku, Naka-ku, Hamamatsu, 432-8561 JAPAN
Official language: English

Overview: In this workshop, people from various research fields, such as image sensing, imaging systems, optics, photonics, computer vision, and computational photography/imaging, come together to discuss the future and frontiers of image sensor technologies, in order to explore the continuous progress and diversity of image sensor engineering and state-of-the-art and emerging imaging systems technologies. The workshop is composed of invited talks and a poster session. We are accepting approx...

Sigma Foveon sensor will be ready in 2022

From PetaPixel: Sigma’s CEO Kazuto Yamaki has revealed that the company’s efforts to make a full-frame Foveon sensor are on track to be finished by the end of the year.

Sigma’s Foveon sensors use a proprietary three-layer structure in which red, green, and blue pixels each have their own full layer. In traditional sensors, the red, green, and blue pixels share a single layer in a mosaic arrangement, and the camera “fills in” the missing colors at each location by examining neighboring pixels. Since each pixel of a photo is recorded in all three colors, the resulting photo should be sharper, with better color accuracy and fewer artifacts.

The release had been delayed on at least two occasions in the past due to technical challenges, once in 2020 and again in 2021. The initial announcement of this sensor was made back in 2018. In February 2022, Yamaki indicated that the company was in stage 2 of testing and that the final, third stage will involve mass-production testing.
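To make the "fill in from neighbors" idea concrete, below is a minimal sketch of bilinear demosaicing for an RGGB Bayer mosaic, the kind of interpolation conventional single-layer sensors rely on (a Foveon-style stacked sensor does not need this step because every site records all three colors). This is a generic illustration, not Sigma's or any camera's actual pipeline.

```python
import numpy as np
from scipy.ndimage import convolve

def bilinear_demosaic(raw):
    """Bilinear demosaic of an RGGB Bayer mosaic (H x W) into an H x W x 3 RGB image.

    Each output channel keeps the samples its pixels actually measured and
    interpolates the missing values from neighboring pixels of the same color.
    """
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1   # R at even rows/cols
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1   # B at odd rows/cols
    g_mask = 1 - r_mask - b_mask                        # G on the remaining sites

    k_rb = np.array([[0.25, 0.5, 0.25],
                     [0.5,  1.0, 0.5 ],
                     [0.25, 0.5, 0.25]])                # bilinear kernel for the sparse R/B grids
    k_g  = np.array([[0.0,  0.25, 0.0 ],
                     [0.25, 1.0,  0.25],
                     [0.0,  0.25, 0.0 ]])               # bilinear kernel for the denser G grid

    r = convolve(raw * r_mask, k_rb, mode='mirror')
    g = convolve(raw * g_mask, k_g,  mode='mirror')
    b = convolve(raw * b_mask, k_rb, mode='mirror')
    return np.stack([r, g, b], axis=-1)

# Example: demosaic a small synthetic mosaic
raw = np.random.default_rng(0).random((8, 8))
rgb = bilinear_demosaic(raw)
print(rgb.shape)  # (8, 8, 3)
```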

Prophesee interview in EETimes

EETimes has published an interview with the CEO of Prophesee about their event sensor technology. Prophesee collaborated with Sony on creating the IMX636 event sensor chip. Some excerpts below.

On the meaning of "neuromorphic": Most companies doing neuromorphic sensing and computing have a similar vision in mind, but implementations and strategies will be different based on varying product, market, and investment constraints. ... there is a fundamental belief that the biological model has superior characteristics compared to the conventional ...

On markets targeted: ... the sector closest to commercial adoption of this technology is industrial machine vision. ... The second key market for the IMX 636 is consumer technologies, ... the event-based camera is used alongside a full-frame camera, detecting motion ... correct any blur. Prophesee is also working with a customer on automotive driver monitoring solutions... Applications here include eye blinking detection, tracking or face tr...
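For context on what an event sensor actually outputs (general background, not from the interview): each pixel independently emits an (x, y, timestamp, polarity) event when its brightness change crosses a threshold, and downstream software often accumulates recent events into a 2D map to localize motion, for example to guide deblurring of a companion frame-based camera. A minimal sketch, with illustrative names and a synthetic event list:

```python
import numpy as np

def accumulate_events(events, height, width, t_start, t_end):
    """Accumulate (x, y, t, polarity) events in [t_start, t_end) into a per-pixel count map.

    Pixels with many recent events saw strong brightness changes, i.e. motion;
    such a map is one simple way to tell a companion frame camera where blur
    is likely to appear.
    """
    motion_map = np.zeros((height, width), dtype=np.int32)
    for x, y, t, p in events:
        if t_start <= t < t_end:
            motion_map[y, x] += 1          # count events regardless of polarity
    return motion_map

# Example: a few synthetic events on a 4x4 sensor
events = [(1, 2, 0.001, +1), (1, 2, 0.002, -1), (3, 0, 0.010, +1)]
print(accumulate_events(events, height=4, width=4, t_start=0.0, t_end=0.005))
```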

3D cameras for metaverse

A press release from II-VI Inc. announces a joint effort with Artilux on a SWIR 3D camera for the "metaverse": https://ii-vi.com/news/ii-vi-incorporated-and-artilux-demonstrate-a-3d-camera-for-enhanced-user-experience-in-the-metaverse/

PITTSBURGH and HSINCHU, TAIWAN, July 18, 2022 (GLOBE NEWSWIRE) – II-VI Incorporated (Nasdaq: IIVI), a leader in semiconductor lasers, and Artilux, a leader in germanium silicon (GeSi) photonics and CMOS SWIR sensing technology, today announced a joint demonstration of a next-generation 3D camera with much longer range and higher image resolution to greatly enhance user experience in the metaverse. Investments in the metaverse infrastructure are accelerating and driving the demand for sensors that enable more realistic and immersive virtual experiences. II-VI and Artilux combined their proprietary technologies in indium phosphide (InP) semiconductor lasers and GeSi sensor arrays, respectively, to demonstrate a miniature 3D camera that operates in ...

Review of indirect time-of-flight 3D cameras (IEEE TED June 2022)

C. Bamji et al. from Microsoft published a paper titled "A Review of Indirect Time-of-Flight Technologies" in IEEE Trans. Electron Devices (June 2022).

Abstract: Indirect time-of-flight (iToF) cameras operate by illuminating a scene with modulated light and inferring depth at each pixel by combining the back-reflected light with different gating signals. This article focuses on amplitude-modulated continuous-wave (AMCW) time-of-flight (ToF), which, because of its robustness and stability properties, is the most common form of iToF. The figures of merit that drive iToF performance are explained and plotted, and system parameters that drive a camera’s final performance are summarized. Different iToF pixel and chip architectures are compared and the basic phasor methods for extracting depth from the pixel output values are explained. The evolution of pixel size is discussed, showing performance improvement over time. Depth pipelines, which play a key role in filtering and enhanc...
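As background for the "phasor methods" the abstract mentions, here is a minimal sketch of the textbook four-phase AMCW depth calculation (a generic illustration, not the paper's specific formulation): four correlation samples taken at 0°, 90°, 180°, and 270° gating offsets give the phase of the returned modulation, which maps to distance through the modulation frequency.

```python
import math

C = 3e8  # speed of light, m/s

def amcw_depth(c0, c90, c180, c270, f_mod):
    """Estimate depth from four AMCW correlation samples (textbook 4-phase method).

    c0..c270 : pixel outputs for gating signals offset by 0, 90, 180, 270 degrees
    f_mod    : modulation frequency in Hz
    Sign conventions vary between sensors; this matches the sample model below.
    """
    phase = math.atan2(c90 - c270, c0 - c180)      # phase of the reflected modulation
    phase %= 2 * math.pi                           # wrap into [0, 2*pi)
    return C * phase / (4 * math.pi * f_mod)       # distance within the unambiguous range c/(2*f_mod)

# Example: a target at 1.5 m with 40 MHz modulation, ideal noise-free correlations
f_mod = 40e6
true_phase = 4 * math.pi * f_mod * 1.5 / C
samples = [math.cos(true_phase - k * math.pi / 2) for k in range(4)]
print(f"Estimated depth: {amcw_depth(*samples, f_mod):.3f} m")
```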