- AR Walking Tour with Sophie Jurrjens
One of our latest experiments at Open Culture Tech involved musician and DJ Sophie Jurrjens, who’s known for her audio walking tours, called Off-Track. Her app offers users an experience where they navigate through nature reserves while listening to a soundscape of electronic music layered with natural ambient sounds through their headphones. For her newest tour, set in the Vliegerbos in Amsterdam, we teamed up with Sophie to see how Augmented Reality (AR) could elevate the experience by adding a visual layer.

Using our open-source AR tool, specifically designed to simplify the creation of AR content, participants could instantly access the AR visuals on their phones by scanning a QR code – no app downloads necessary. Sophie developed a 30-minute audio tour, enhanced by three stops (QR-code posters) along the route. These triggers revealed vibrant digital birds, butterflies, and dragonflies that appeared to flutter through the forest. The walk culminated at MurMur, a listening club where Sophie DJed a set, while a massive, floating pink moon – visible only through AR – hovered overhead, blending the digital with the real.

During the project, Sophie was introduced to the AR tool and quickly got hands-on with building her own AR experience. While the experiment successfully added depth to her work and proved the tool’s ease of use, Sophie admitted that she felt overwhelmed by the sheer number of possibilities. She noted that clearer examples, like tutorials on how to animate AR objects, would have helped streamline the creative process. We’re incorporating this feedback as we continue to refine the tool for future projects.

As immersive technology evolves, Sophie’s project demonstrates how AR can blend natural environments with digital elements, creating a layered experience that goes beyond the typical music experience.
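For readers who want to try the QR-poster workflow themselves, producing a scannable poster is straightforward. Below is a minimal sketch using the Python qrcode library; the scene URL is a hypothetical placeholder, since each scene published with the AR tool gets its own link.

```python
# Minimal sketch: generate a printable QR-code poster image that links
# to a published web AR scene. The URL below is a placeholder; in the
# workflow described above, each published scene has its own link.
import qrcode  # pip install qrcode[pil]

scene_url = "https://example.com/ar/scene/vliegerbos-stop-1"  # hypothetical scene link

qr = qrcode.QRCode(
    error_correction=qrcode.constants.ERROR_CORRECT_H,  # high redundancy survives print wear
    box_size=20,  # large modules so phones can scan the poster from a distance
    border=4,     # quiet zone around the code
)
qr.add_data(scene_url)
qr.make(fit=True)
qr.make_image(fill_color="black", back_color="white").save("stop-1-poster.png")
```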
- Christmas party with Linde Schöne
Linde Schöne is a singer and songwriter who has created a live album. It was her wish to create an over-the-top and engaging Christmas performance to immerse the audience in the joy and happiness of that time of year. Together we created a live show with live avatars and mobile AR Christmas scenes.

For years, Linde has wanted to create a Christmas live show to accompany her Christmas album: a grandiose Christmas experience with flying elves, background dancers and Christmas decor. We decided to use our avatar system to bring the background dancers to life as dancing elves, based on Linde’s movements. To do so, we created a Christmas scene in the avatar system with Christmas trees and built an avatar dressed as a Christmas elf. For this, we used the various sliders in the system to create the right settings, together with a 3D elf character we bought on the Unreal Engine Marketplace. At that time we were not yet able to place multiple avatars in the system, so we could only display one avatar on screen. Spoiler: now you can create a whole crew of background dancers in our system. To capture Linde’s body movements, we set up a small camera at the front of the stage, angled at her.

To create an immersive Christmas experience, we carefully designed the stage setup to complement Linde’s performance. At the back of the stage, a short-throw projector was used to display festive visuals on the backdrop. A short-throw projector is designed to project a large image from a short distance, making it ideal for smaller venues. Throughout the performance, this projection showcased warm and cozy Christmas scenes, including crackling fireplaces, twinkling lights, and snowfall, enhancing the holiday atmosphere.

The aim was to integrate our avatar tool halfway through the show. Linde’s movements would be tracked in real time by our mocap camera and translated into the movements of the 3D elf, allowing them to dance synchronously. This would create a unique and dynamic interaction between the real and virtual worlds, making the performance even more magical. Although Linde was on fire and performed a great set, we ran into trouble with the live avatars and the motion capture. We had too little time to do a final check of the motion capture prior to the show, and it did not work well because of a bug in our system (which is now fixed) and the lighting on stage: the lights were shining into the camera, so we were unable to capture Linde’s movements properly. Luckily, Linde was able to laugh it off and continue her performance in good spirits.

To make the performance more engaging for the audience, we added two mobile AR scenes: a snowy Christmas scene with Linde on top of a stack of Christmas presents, and a scene in which elves flew through the room. To stay in the mood, we printed the QR codes of the scenes on the back of a Christmas card that could be sent to loved ones.
- General Progress Update
At Open Culture Tech, we are now three-quarters of the way through the project, making this a good moment to briefly reflect on what we've done so far, but more importantly, to look ahead to what we still have planned.

Over the past few months, we’ve hosted a series of successful showcases featuring a diverse selection of artists. Among them, OATS, Smitty, Eveline Ypma, Vincent Höfte, Sophie Jurrjens, Jan Modaal, and Ineffekt have all incorporated AI, AR, and avatar technology into their work. Additionally, collaborations are in the works with Linde Schöne, Nana Fofie, DJ Casimir, and Alex Figueira. You can check the performance dates at www.openculturetech.com/agenda

These collaborations with artists have so far resulted in three main tools:
- AR Tool: An open-source tool that allows artists to create their own mobile augmented reality experiences
- Avatar Tool: An open-source tool enabling artists to create avatars for live visuals and connect them to a webcam for motion tracking
- AI Tool: An open-source tool that allows you to generate musical samples without collecting data

Delivery
We are currently working on delivering the tools. To facilitate this, we are redesigning our web portal so that all the tools can be easily found and downloaded if needed. This portal will also be linked to GitHub, where all open-source code can be accessed. As we finalize the tools, we are also exploring suitable licensing models to make everything accessible to artists, while possibly charging a small fee to commercial entities.

Sneak Preview: Open Culture Fest
On February 3rd, we will present all project results at Open Culture Fest in the Melkweg, Amsterdam. The day will be filled with live showcases, workshops, presentations, and a networking event. Stay tuned for more info.

Next Steps
In collaboration with the Netherlands Institute for Sound and Vision and AIxDesign, we’ve submitted a follow-up funding proposal to Cultuurloket Digitall. With the next phase of the Open Culture Tech project, we aim to make the initiative even bigger and more accessible. Our plans include:
- Structuring AI & XR workshops that will be freely accessible to artists
- Setting up residencies where artists are paired with a tech partner to co-develop a live performance
- Expanding the toolkit with six new tools
- Opening the toolkit to external developers so they can contribute their own tools
- Collaborating with educational institutions to further embed the Open Culture Tech philosophy in music education
- Augmenting Halloween
High Tea is an organization that promotes new and established names in the drum 'n bass scene by organizing live shows and festivals. High Tea events often have a theme that is applied in the decoration of the location and the branding of the event. On October 25th, in a sold-out Melkweg Amsterdam, the theme was Trick or Treat. In other words: Halloween. With our AR tool, High Tea developed a number of mobile Augmented Reality experiences to give the Melkweg a spooky theme.

For High Tea, the focus of the event is on the one hand promoting the artists and on the other hand organizing a great evening program for the audience. Because it is important not to distract too much attention from the artists, we decided in consultation to use AR for the decoration of the event outside the main hall. The AR was therefore visible outside in the queue where people were waiting to get in, and in various places in the lobby and toilets.

Within our work, we are not only looking for the best way to apply AR at pop venues, but also for good workflows, so that we can discover who should do what when you want to use AR. In this use case, we coordinated the process with Melkweg, which facilitates the space, and the organizer of the evening, High Tea. High Tea asked two VJs from their own organization to design the AR visuals, and Melkweg was consulted about where these could best be placed. The VJs designed five AR experiences that they placed outside and inside the building as a kind of treasure hunt. They printed posters with QR codes for this purpose so that visitors could scan the experiences.

The VJs indicated that they were able to handle the AR tool well, but that they would like a little more control over the movements of the objects. It is not always clear what the different values in the tool mean in relation to the size or speed of a movement. They also indicated that they would benefit greatly from a timeline functionality that lets you link a change of AR scene to a point in time.

The idea of using AR to decorate the space and to bring the concept of the party to life worked well. The AR offered added value and provided an extra dimension in the decoration of the party. Melkweg clearly has a facilitating role for the party, so it worked well that the organizer of the party, High Tea, designed the AR experiences themselves and coordinated with Melkweg on how best to offer them on the evening itself. On the other hand, it became clear during the evening that many people do not know what AR is. It therefore has little added value to promote the experience as 'AR'; it is better to call it an extra experience or give it a different title. The placement of the QR codes also turned out to be a problem: the QR codes of the AR experience were confused with functional QR codes for, for example, the lockers. It was also not always clear that they were different experiences and that it was therefore worth scanning all the individual posters and viewing each experience.

In conclusion, the evening taught us that the examined workflow has a lot of potential, but that we need to look more critically at how we offer the QR codes. The conceptual embedding of the AR experience in the horror theme offered clear added value for the party.
- Jandroid Modaal live in concert
Jan Modaal is a punk “smartlappen” artist. Jan is energetic and expressive on stage, but during his performance, he needs to focus on making his music. That's why we looked for innovative ways to use new technology to enhance Jan's physical expression – and his connection with the audience. This challenge led to the creation of the Jandroid Modaal concept, in which the physical merges with the digital in a future-punk performance centered around avatar technology.

The technical foundation for this performance is our Open Culture Tech “Avatar Tool”, which we also used for OATS and Smitty. To enhance Jan’s physical expression, we focused on his facial expressions and body language. To improve our prototype, we built a new feature into our Avatar Tool that allows live motion tracking based on camera input – instead of an expensive motion capture suit (a technical sketch of this idea follows at the end of this piece). In preparation for this show, Jan explored new ways to design himself as an avatar, resulting in six different virtual alter egos, each with corresponding background images. These backgrounds consist of AI-generated videos created in Runway ML, a generative AI video tool. These six avatars then formed the basis for a new musical repertoire, with electronic, synthetic music as the foundation.

To present his six avatars on stage, Jan decided to hang a curtain in front of the stage and stand behind it, while a projector displayed the avatars on the curtain. Behind the screen, Jan was filmed by a webcam that tracked his movements and transmitted them to the Avatar Tool, which animated real-time 3D characters. A second camera captured his facial expressions so that his mouth and eye movements could be copied in real time to the avatar.

The setup with motion tracking and performing behind the projection screen posed a challenge for the lighting. Some lighting was needed behind the screen to track Jan’s movements, as the cameras couldn’t register him in complete darkness. At the same time, it had to be as dark as possible behind the screen, because we projected onto it from the front and backlighting made the projections less visible. For this reason, we placed a small light on stage. However, it broke halfway through the show and was immediately replaced with a backup light, which turned out to be much brighter. As a result, the audience could see Jan’s silhouette through the screen. In hindsight, many people in the audience only then realized that the avatars were responding to Jan’s movements in real time. This unexpectedly became a great addition to the show.

Halfway through the show, Jan began tearing the screen apart, making himself visible to the audience. At first, he cut out a square, standing in a kind of frame, with projections still surrounding him. As the show progressed, the entire digital world came down with the screen, and Jan finished the show in his physical form. This show took place in a small music venue at Cinetol, Amsterdam, and is currently being further developed. If you're curious about more, keep an eye on our schedule for updates.
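The Avatar Tool’s camera tracking is its own implementation, but the underlying technique – estimating body pose from a plain webcam and streaming joint positions to whatever renders the avatar – can be sketched in a few lines. Below is a minimal illustration using MediaPipe and OSC; the OSC address scheme and the receiving endpoint are assumptions for the sketch, not part of the tool.

```python
# Illustrative sketch of camera-based motion tracking (not the Avatar
# Tool's actual code): estimate body pose from a webcam with MediaPipe
# and forward the landmarks over OSC to a hypothetical avatar renderer.
import cv2                      # pip install opencv-python
import mediapipe as mp          # pip install mediapipe
from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

client = SimpleUDPClient("127.0.0.1", 9000)  # hypothetical renderer endpoint
pose = mp.solutions.pose.Pose(model_complexity=1)
cap = cv2.VideoCapture(0)       # the stage webcam

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV delivers BGR.
    results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.pose_landmarks:
        for i, lm in enumerate(results.pose_landmarks.landmark):
            # /pose/<index> -> normalized x, y, z per joint (address is an assumption)
            client.send_message(f"/pose/{i}", [lm.x, lm.y, lm.z])

cap.release()
```

A sketch like this also makes the lighting constraint above concrete: when the camera image is too dark, or flared by stage lights, the pose estimator simply returns no landmarks and the avatar freezes.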
- AR and live venues
Mobile Augmented Reality (AR) is a big part of the Open Culture Tech project. Together with Ineffekt, Vincent Höfte, Smitty and Sophie Jurrjens we have developed unique tools and practices that have proven to be both accessible and valuable. But during all these showcases, we also learned that the physical environment plays an important role in the successful application of AR. This raises the question: are live venues prepared for a future where AR is increasingly used?

To answer this question, Thunderboom Records initiated a spin-off project called Next Stage AR. Together with the Dutch venues Paradiso and Melkweg and the open-air theater Bostheater Amsterdam, we are investigating how venues can use AR to support artists and how they can expand their own services. Whereas Open Culture Tech looks at technology from the perspective of artists, Next Stage AR takes this a step further and looks at the possibilities for venues to unlock the potential of AR in the context of marketing or wayfinding. For this project, we build on the current AR tool that we have created for the Open Culture Tech project, which is open-source and freely available. The setup of Next Stage AR is the same as Open Culture Tech: we build prototypes, test them with audiences, and then make improvements, ultimately delivering an open-source toolkit and best practices that we share with the sector.

We have already started a series of live concerts in which we have tested various possibilities and workflows. We started this project with a first test event in Paradiso, called Café Vanguard. This is an evening where upcoming artists get the opportunity to perform a short live set. This was a good starting point, since there were four different artists and a designated VJ who was very willing to experiment with the tool. We asked the venue, which was organizing this event, to discuss the use of AR with the performing artists. Three of them were willing to try it. We then asked a VJ to collaborate with them on creating an AR experience for one of their songs. The VJ, called VJ Bikkel, managed to exceed our expectations. He asked the artists for their ideas and brought them to life in three very engaging AR experiences. VJ Bikkel added transparent layers with galaxies to the experience, which gave it a very spatial effect. The concert was a seated event, so we put cards with QR codes on the tables so that the audience knew what to do.

Two academic researchers, Femke Vandenberg and Frank Kimenai, are affiliated with the Next Stage AR project. They are conducting research into both the audience experience and the way in which these types of innovations integrate into the music sector. The researchers interviewed some of the visitors, artists and the VJ afterwards, which produced some valuable first insights:
- Not all artists were aware of what they had said yes to, and they had difficulty understanding what the AR experience was going to be. One artist forgot to mention during the show that the AR was happening. From this we learned that it is very important to announce the AR clearly to the audience.
- The VJ was very enthusiastic about our tool being free and open-source. He would like some more functionalities, which we have put on our development timeline. Based on his input, we have also added an opacity control.

The Next Stage AR project will run until mid-2025 and has a lot of overlap with Open Culture Tech.
This means that we will combine the insights from both projects in our best practices and functionalities of our open-source AR tooling. Keep an eye on the Thunderboom Records channels if you want to stay up to date on our live shows, presentations or demos.
- Discover our new AR Tool
We are proud to announce that the new version of our creative Augmented Reality tool has been launched. With this tool, artists can easily build their own mobile AR experiences. Some examples of these applications were previously shared in this newsletter as part of the live shows of Vincent Höfte, Smitty and Ineffekt.

To refresh your memory: Augmented Reality (AR) is a technology that adds digital elements to the real world, often via devices such as smartphones or AR glasses. This allows users to interact with virtual objects and experience information integrated into their immediate environment. AR has proven its value as an immersive application. For example, the Gorillaz presented their new single in AR at Times Square, and Coachella collaborated with Snapchat to present an AR experience that included art and wayfinding. For many artists, AR can enhance live shows and add props without the need for extensive physical setups.

Despite the fact that there are several AR tools on the market, we have chosen to build our own tool – specifically for the context of live shows for emerging musicians. We chose this approach because most existing tools are overly complicated, time-consuming, and insufficiently protect the user's intellectual property. Our tool simplifies the creative process with a user-friendly interface designed for artists without prior design experience. Additionally, it stores only the necessary data, which is never reused for commercial purposes.

What's new
Our new OCT AR tool is inspired by the concept of a synthesizer. It allows you to create and arrange scenes in 3D space, where you can place and publish images. Similar to a real synthesizer, it features various oscillators (or parameters) that you can adjust to add effects and movement to the elements within your scenes. Check it out yourself: https://ar.sprps.dev

Please note that you will need to create the content for the AR application yourself. In the near future, you will also be able to integrate 3D content from existing online libraries, but this feature is not available yet. To explain all the functionalities, we have made a short instructional video: https://vimeo.com/974167199

Current features:
- Create an AR project (scene) that can be viewed with one tap on iOS and Android devices after scanning a QR code; no app download required.
- Create multiple scenes per project/QR code and switch between scenes during your live show.
- Place 2-dimensional objects (planes) in a 3D workspace.
- Add transformations (move, rotate, scale) to an object.
- Add an image texture to an object; supports JPEGs and (transparent) PNGs.
- Add a transparency mask to an object (make objects more or less transparent with a colored background).
- Add animations to an object.
- Stack transformations and animations to create more complex movement (illustrated in the sketch after the feature lists).

Planned features:
- Reorder transformations.
- Group objects.
- Support for video/animated GIF textures.
- Import 3D objects.
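To make the stacking feature concrete, here is a purely hypothetical scene description written as a Python structure. The tool's actual file format is not published here, so the field names and value ranges are illustrative only; the point is that several transformations and animations apply to the same object, in order.

```python
# Hypothetical scene description (not the tool's real file format),
# illustrating how stacked transformations and animations compose:
# each entry in the lists is applied in order to the same object.
scene = {
    "name": "demo-scene",
    "objects": [
        {
            "texture": "butterfly.png",  # transparent PNG, as supported by the tool
            "transforms": [
                {"type": "move",   "to": [0.0, 1.5, -2.0]},   # lift up and push back
                {"type": "scale",  "factor": 0.5},
                {"type": "rotate", "axis": "y", "degrees": 45},
            ],
            "animations": [
                # Stacked: a slow orbit combined with a faster bobbing motion
                # produces the fluttering effect used in the showcases.
                {"type": "orbit", "axis": "y", "period_s": 12.0},
                {"type": "bob",   "axis": "y", "amplitude": 0.2, "period_s": 2.0},
            ],
        }
    ],
}
```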
- Avatars & AI visuals with Jan Modaal
Jan Modaal is a punk “smartlappen” singer from Amsterdam. He is an expressive artist who highlights social themes in his music and lyrics. On August 24, he will play live at Cinetol in Amsterdam. Below, we give you a sneak peek of the conceptual and technical content of this evening.

As a stage performer, Jan Modaal is the frontman and guitar player in various punk bands. Apart from playing the guitar and singing, Jan has limited opportunities to express himself physically and visually on stage. For this reason, Jan and Open Culture Tech are exploring how we can augment his face and body movement – and translate this into real-time visuals using our own Avatar creator tool. With the latest version of the tool, it is possible to upload yourself as a 3D character (with Polycam) and animate this avatar in real time using a webcam that copies your movement. In our first live show with OATS, we did this with a motion capture suit, but that was too expensive and tech-heavy, so for Jan we developed a webcam feature. In addition, it is possible to place the avatar in a 3D space and adjust the background environment. This 3D creation can be live-streamed and projected on a large screen.

From this technical basis, a concept has been developed for the live show of Jan Modaal, called “Jandroid Modaal”. During this show, large curtains are hung at the front of the stage. Jan will stand behind these curtains and perform his show while his movements are recorded by the webcam – which turns his movements into 3D motion. A livestream of this animated avatar is then projected onto the curtain on stage. The background environment is decorated with videos that Jan Modaal makes with Runway, a popular text-to-video AI tool that allows you to easily generate short videos based on text prompts. In these videos, various “future scenarios” are generated from different “punk” angles: cyberpunk, solarpunk, steampunk, biopunk, and so on.

In our Jandroid Modaal show, the fusion between digital and physical reality is central, while Jan explores the tension between these two worlds. During the live performance, Jan will play with the world in front of the screen and the world behind the screen. We are not revealing exactly how; you can experience it live on August 24 at 8:00 PM in Cinetol Amsterdam. The use case with Jan Modaal has resulted in a series of new functionalities for our Avatar creator tool that we will soon release on the Open Culture Tech website.
- Showcase: AR and AI soundscapes
Eveline Ypma is a soundscape artist from Amsterdam and Vincent Höfte is a jazz pianist from The Hague. Together with the Open Culture Tech team, both artists have worked on two unique concepts in which we use our own generative AI and mobile Augmented Reality prototypes to enrich their live performances. In this article we briefly take you through our journey.

Eveline Ypma & AI samples
Laugarvatn is an existing production created by Eveline Ypma, consisting of three parts of 5 minutes each. The performance is named after a place where Eveline made several field recordings during her residency in Iceland. These field recordings form the basis for three soundscapes in which she combines them with live vocals and bass guitar. Together with the Open Culture Tech team, a fourth part of 10 minutes has been created in which the Icelandic field recordings have been replaced by AI-generated sound samples in the style of the originals. To generate these samples, Eveline played with various text-to-music tools (think ChatGPT, but for music). During her residency in Iceland, Eveline never saw the Northern Lights, so she decided to use AI to generate unique sound samples based on the prompt “Northern Lights Soundscape”. In this way, Eveline was able to create new music inspired by her journey and add a musical piece to her existing work Laugarvatn.

The result of the collaboration between Eveline Ypma and Open Culture Tech is not only a beautiful showcase in which we used generative AI to produce unique samples for a live performance, but also the first version of our own open-source AI tool that allows anyone to create their own samples based on prompts. If you are curious about the process of creating this tool, and want to know more about how this application came about, read the detailed article here. And stay tuned: the Open Culture Tech AI-driven sample tool will be published soon.

Vincent Höfte & mobile AR
Vincent Höfte is a jazz pianist who regularly plays on public pianos at train stations throughout The Netherlands. Together with Open Culture Tech, Vincent has created a short performance in which he plays his own piano pieces while a mobile Augmented Reality filter adds a visual layer to reality. By scanning a QR code with your smartphone, you see colored shapes floating through the train station. These shapes are remixed photos of the station hall itself, creating a mix between the architecture of the station and the repeating shapes in the photos. This show used the first version of our own Augmented Reality app, which we will publish freely and publicly in a few months. If you are curious about the process of creating this application and want to know more about how it was created, read the extensive article here.
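The Open Culture Tech sample tool itself has not yet been published, but the general prompt-to-sample technique Eveline used can be illustrated with an openly available model. The sketch below uses Meta's MusicGen via the audiocraft library purely as a stand-in; it is an assumption for illustration, not necessarily what the tool is built on.

```python
# Generic prompt-to-sample sketch using Meta's open MusicGen model
# (an illustration of the technique, not the Open Culture Tech tool).
from audiocraft.models import MusicGen          # pip install audiocraft
from audiocraft.data.audio import audio_write

model = MusicGen.get_pretrained("facebook/musicgen-small")
model.set_generation_params(duration=10)        # generate a ten-second sample

# The prompt Eveline used as her starting point.
wav = model.generate(["Northern Lights Soundscape"])

# Write the single batch item to northern_lights.wav with loudness normalization.
audio_write("northern_lights", wav[0].cpu(), model.sample_rate, strategy="loudness")
```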
- AI and music with Melle Jutte
Melle Jutte, R&B artist and producer from Amsterdam, has always had a curious mind when it comes to new technologies. Together with Open Culture Tech, he is diving into an experiment with the aim of composing four to five new songs, using a different AI tool for each song. So far, Melle has written three tracks, using three different methods. All these tracks are put into a live show by Melle to investigate how this new form of composing works in a live context.

His first experiment was with Suno, a popular AI tool that generates music. Melle had Suno generate a piece of music and then played it himself on his instruments. Although he put his own spin on this, it mainly felt as if he was imitating someone else's music. The process was technically impressive, but artistically less satisfying. It gave him a feeling of limited control, which hindered his creativity. Nevertheless, Melle continues to experiment with Suno to see whether he can ultimately achieve a satisfying result by finding the right balance between the generated basis and his own instrumental influence.

Next he tried Magenta, an AI tool that generates single MIDI beats and melodies. Despite the interesting possibilities, Melle often found the result dry and random. The generated beats and melodies had little coherence, which meant he had to spend a lot of time adjusting and piecing together the generated music.

The third experiment took him to Udio, a tool similar to Suno. Instead of playing the generated music, Melle split the audio tracks and used the individual tracks as samples (this splitting step is sketched at the end of this article). This gave him the freedom to play and experiment with sounds in a way that he found very inspiring. Manipulating the samples gives him the opportunity to be truly creative, without feeling limited by the original structure of the generated music.

For the other experiments, Melle is curious about the potential of MIDI in a less random setting. He is considering playing with tools such as ChordChord, AIVA and MusicLang, and also wants to explore what he can achieve when writing lyrics using ChatGPT. He is especially curious how these tools can contribute to a more structured and coherent creative process, while still retaining the freedom to follow his own artistic vision.

Melle's research consciously focuses on the artistic potential of generative AI technology, where, unlike Eveline Ypma's Open Culture Tech project, he does not pay attention in advance to copyright and terms of use. Melle is aware of the risks and ethical dilemmas associated with the use of AI, but his focus is on freely exploring the artistic possibilities. Reflection on the complications of his creations only follows after the music has been created.
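Melle's exact splitting workflow is not documented here, but the step he describes – breaking a generated track into individual stems to use as samples – can be reproduced with open-source source separation. Below is a minimal sketch using Demucs as our stand-in tool; the file names are placeholders.

```python
# Minimal sketch: split a generated track into stems with Demucs and
# load one stem as raw material for sampling. Demucs is our stand-in
# here; Melle's exact splitting workflow is not documented.
import subprocess
import soundfile as sf   # pip install soundfile; separator: pip install demucs

# Separate into drums/bass/vocals/other under ./separated/htdemucs/track/
subprocess.run(["demucs", "track.wav"], check=True)

# Load the drum stem, ready to slice and manipulate as samples.
drums, sr = sf.read("separated/htdemucs/track/drums.wav")
print(f"{len(drums) / sr:.1f} s of drums at {sr} Hz, ready to chop into samples")
```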
- Showcase: Avatars
Smitty is a rapper from Amsterdam who talks to different versions of himself in his music. He talks to his youngest self and older self about different aspects of life. In his live shows he wants to enhance this concept through new technology. Together with Open Culture Tech, Smitty has developed a live show that uses our immersive avatar technology and mobile Augmented Reality to make this happen. The evening was composed of three parts. The first part consisted of the audience reception, where we used mobile AR to introduce the audience to Smitty's lyrics. The second part consisted of the live show, in which we projected various 3D avatars of Smitty on the white walls of the hall. The third part consisted of a Q&A between the audience, Smitty and members of Open Culture Tech.

The entrance
Upon entry, large QR codes were projected on the screen to access the experience. To highlight the lyrics of Smitty's music, we created an AR experience with the Open Culture Tech AR app. The idea behind this was to create a virtual world where Smitty's lyrics floated through space. In the AR experience, five different texts from Smitty were spread throughout the room. Visitors could walk through the white empty space of @droog and see the different texts, in the same way as you would at an art exhibition. The AR experience was a warm-up before the show.

The show
In order to make the 3D avatars as prominent as possible, we wanted to create the illusion of an LED wall at @droog. An LED wall is a wall of LED screens on which you can play visuals. Such a wall is very expensive and therefore unfeasible for most smaller stages. In addition, LED requires some distance between the audience and the screens to provide a clear image, which is also difficult in many smaller venues. We solved this by installing two projectors that were of good enough quality to project onto the walls. The projections had to run from the ceiling to the floor, because otherwise it still looks like you are watching a normal projection. The projectors were aligned in such a way that they projected onto the walls on either side of the stage. This resulted in minimal shadows from the band on the projections. Various atmospheric images were projected on this back wall to support the show. These atmospheric images were a combination of free stock videos from, for example, Pexels and Smitty's own video recordings.

After the second song, Smitty's first 3D avatar was introduced on screen. This animated 3D avatar was a younger version of Smitty who slowly turned towards the audience. An older version of Smitty was then shown, and these avatars were edited together. The different avatars, in different animations, built up to an eclectic mix that worked towards a climax. Because we did not want to show the avatars for the entire show, but also wanted to show other atmospheric images, we created a simple VJ setup in TouchDesigner, a software tool with which we could build our own video player. This way we could control the visuals on the projections with a MIDI controller, and using an HDMI splitter we could drive both projectors from one laptop. An important condition for using projectors is that there cannot be too much light in the room, because the projections then become less visible. In Smitty's case, the projections provided enough light to illuminate the room. Two small RGB spots and a white spot on Smitty himself were sufficient to properly illuminate the stage.
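The video player itself was built in TouchDesigner, which is node-based rather than scripted, but the control logic – mapping MIDI pads to projection clips – is easy to illustrate in a few lines. Below is a minimal Python sketch with mido; the port name, pad numbers and clip list are assumptions for the sketch.

```python
# Illustrative MIDI-to-clip mapping (the real player was a TouchDesigner
# patch; this only shows the control logic). Port name and pad numbers
# are assumptions for the sketch.
import mido  # pip install mido python-rtmidi

CLIPS = ["ambient_loop.mp4", "avatar_young.mp4", "avatar_old.mp4", "climax_mix.mp4"]

with mido.open_input("MIDI Controller") as port:   # hypothetical device name
    for msg in port:
        if msg.type == "note_on" and msg.velocity > 0:
            # Map pads 36..39 (a common drum-pad range) to the four clips.
            index = msg.note - 36
            if 0 <= index < len(CLIPS):
                print(f"switch projection to {CLIPS[index]}")
```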
The Q&A
In addition to music lovers, the audience also included many musicians and fellow rappers. For this group, LED walls, animated avatars and augmented reality are normally not within reach. From the conversations with the audience it became clear that they found the show, which lasted approximately 45 minutes, impressive. The visuals added a valuable layer and supported the substance of Smitty's story. This confirmation is important for the progress of Open Culture Tech, as it validates that our technology is usable for the target group. Follow-up agreements have been made with various fellow rappers to investigate how the Open Culture Tech toolkit can be used more broadly within the Amsterdam hip-hop community. To be continued.
- Summary Report
This report is a summary of the results we collected in the first seven months of the Open Culture Tech project. We surveyed more than 100 artists and other relevant stakeholders from the music industry through knowledge sessions, guest lectures, workshops at conferences (such as ADE and ESNS), surveys and interviews.

(Photo by Calluna Dekkers)

It is important for the Open Culture Tech project to map out where the opportunities and challenges lie for emerging artists. This way we ensure that we develop technology that meets the needs of artists and their audiences. It is also important to share more information with the sector: artists need to know what their options are, even with small budgets and limited technical knowledge, and the broader industry needs to know how it can facilitate artists and how we can ensure that the Dutch music landscape does not fall behind compared to, for example, the American or Chinese music industry.

LINK to full Summary Report