
STUD: Accessibility of Computer Vision for the Parking Industry

Alternative title: Tilgjengeliggjøring av datasyn for parkeringsbransjen

Awarded: NOK 1.00 mill.

Project Number:


Project Period:

2021 - 2022

Funding received from:



Thanks to funding from the Research Council of Norway, VIS-TEK has developed a fully wireless camera sensor for smart parking and computer vision applications. What makes our development unique is that the camera sensors are self-powered by low-power solar cells and can transfer large quantities of data while using relatively little energy. We also developed our own computer vision software to accompany the camera sensor hardware. After the sensor was developed in-house, an achievement that took longer than expected due to numerous supply chain issues, we ran a successful MVP pilot test of the prototype. The camera sensors and computer vision software detected empty parking stalls with a very high level of accuracy.

In the next few weeks, ten sensors will be built from the prototype, and a final pilot test will then be completed. We estimate that in three to four months our technology will be ready for market and mass production. When our smart parking and computer vision technology enters the market, it will be significantly more affordable and easier to install than any comparable product currently available. Because the camera sensor is wireless, it can be mounted virtually anywhere, making it more versatile and capable of serving more applications than wired alternatives.

Initially, we set out to create a product that would help parking companies and municipalities develop smart parking. But because the initial pilot test was so successful, we have also received inquiries from private property development companies and research organizations interested in using our technology to count motorists, cyclists, and pedestrians. Next steps for VIS-TEK include reaching out to municipalities and researchers with a ‘smart city’ focus.
Our technology will give urban planners accurate data on exactly how traffic flows within urban centres. Information gathered by our smart parking and computer vision software will help municipalities enhance mobility, improve energy efficiency, and strengthen ICT infrastructure. Our purpose in developing this smart sensor and computer vision technology has always been to advance society and help meet the world’s ambitious environmental and social goals. Applied correctly, our technology will ease traffic congestion and reduce greenhouse gas emissions; quality of life and productivity will increase as people spend less time in traffic, and cities can be designed and restructured for better land use and greater livability. VIS-TEK wants to consult with urban developers to discover exactly how our technology can best be utilised to meet these goals. With our innovative technology, bicycle, pedestrian, and car traffic can be observed and studied in real time, hour by hour, day by day, and week by week, replacing the manual, randomised sampling that many cities carry out once or twice a year to assess the effectiveness of their infrastructure. A clearer and more accurate understanding of daily traffic flow will allow municipalities to innovate and make infrastructure changes for the betterment of society and the environment, faster.
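The stall-occupancy detection described above can be illustrated with a simplified sketch: each stall region of a frame is compared against an empty-lot reference frame, and a large enough difference marks the stall as occupied. The frame format, region coordinates, and threshold below are illustrative assumptions, not VIS-TEK's actual pipeline:

```python
# Illustrative sketch of parking-stall occupancy detection.
# Assumptions (not from the VIS-TEK product): frames are 2-D lists of
# grayscale pixel values, each stall is an axis-aligned rectangle, and a
# stall counts as occupied when its mean absolute difference from an
# empty-lot reference frame exceeds a fixed threshold.

def stall_occupied(reference, frame, stall, threshold=30.0):
    """Return True if the stall region differs enough from the reference.

    reference, frame: 2-D lists of grayscale values (0-255).
    stall: (top, left, bottom, right) rectangle, end-exclusive.
    """
    top, left, bottom, right = stall
    diff_total = 0
    count = 0
    for y in range(top, bottom):
        for x in range(left, right):
            diff_total += abs(frame[y][x] - reference[y][x])
            count += 1
    return (diff_total / count) > threshold

def occupancy_map(reference, frame, stalls, threshold=30.0):
    """Classify every stall; returns a list of booleans (True = occupied)."""
    return [stall_occupied(reference, frame, s, threshold) for s in stalls]

# Tiny 4x8 example: stall 0 matches the reference (empty), stall 1
# contains a dark "car" that produces a large pixel difference.
reference = [[200] * 8 for _ in range(4)]
frame = [row[:] for row in reference]
for y in range(4):
    for x in range(4, 8):
        frame[y][x] = 40  # dark object parked in the right half
stalls = [(0, 0, 4, 4), (0, 4, 4, 8)]
print(occupancy_map(reference, frame, stalls))  # → [False, True]
```

A production system would use a trained detector rather than a fixed threshold, but the per-stall region comparison shown here is the core idea of turning camera frames into an occupancy map.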

The project had a large impact on VIS-TEK as a company: we developed our smart parking technology and computer vision software and pilot-tested our wireless camera sensor prototype. We also gained valuable market insight that has attracted interest from private companies and researchers in using our product. Because of this project, VIS-TEK has become an innovator in the field of camera sensor technology. We are the first company to create a wireless camera sensor that is low-power, solar-powered, and able to transfer large amounts of data while applying machine learning. Going wireless has also made our technology more affordable than any comparable device on the market. The effect this has had on VIS-TEK is larger than anticipated: we are on track to beat our initial sales projection for 2022 by a good margin.

The positive effects of this project go beyond VIS-TEK; our product will have a large and beneficial societal impact. Being solar-powered and wireless, our technology is greener than our competitors’. When deployed in congested urban areas with a shortage of parking spaces, it will spare motorists from driving around looking for parking, which will reduce greenhouse gas emissions. It also saves people time, which can increase the quality of life and overall productivity of city residents. Initially, our product was developed to help municipalities deal with the growing problem of limited parking space in urban centres; our goal has been to help cities pursue smart city development by enhancing mobility, improving energy efficiency, and strengthening ICT infrastructure. But we have found that research organizations and private companies are also interested in our technology for the valuable data it can provide.
Our technology can count any object a person may be interested in tracking for research and information purposes, including the number of vehicles, cyclists, pedestrians, or wildlife in a specific area. Furthermore, it is 60-90% more affordable than competing products, is easier and less invasive to install, and can be mounted virtually anywhere there is a light source to charge the solar cell battery. This positions our technology to be widely adopted for many more applications than smart parking and computer vision sensors have ever been used for before.
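The counting use case above, turning individual detections into the hour-by-hour traffic statistics urban planners need, can be sketched as a simple aggregation step. The (timestamp, label) detection records here are hypothetical detector output, not VIS-TEK's actual data format:

```python
# Illustrative sketch: aggregate raw object detections into per-hour,
# per-class traffic counts. The input format is an assumption for this
# example, not the VIS-TEK product's actual output.
from collections import Counter
from datetime import datetime

def hourly_counts(detections):
    """Aggregate (timestamp, label) detections into hourly class counts.

    detections: iterable of (datetime, str) pairs.
    Returns a dict mapping (date, hour) -> Counter({label: n, ...}).
    """
    buckets = {}
    for when, label in detections:
        key = (when.date(), when.hour)
        buckets.setdefault(key, Counter())[label] += 1
    return buckets

# Hypothetical detections from one morning.
detections = [
    (datetime(2022, 5, 1, 8, 5), "car"),
    (datetime(2022, 5, 1, 8, 17), "cyclist"),
    (datetime(2022, 5, 1, 8, 42), "car"),
    (datetime(2022, 5, 1, 9, 3), "pedestrian"),
]
counts = hourly_counts(detections)
# The 08:00-09:00 bucket on 2022-05-01 holds 2 cars and 1 cyclist.
print(counts[(datetime(2022, 5, 1).date(), 8)])
```

Bucketing by (date, hour) keys makes the same aggregation work unchanged for day-by-day or week-by-week reporting: only the key function changes.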

Funding scheme: