ifbbw – your video and live streaming service provider since 2010
ifbbw Institute for Image Movement – your video and live streaming service provider for digital video content and live broadcasts

VIDEO WIKI

Absorbers, diffusers, and bass traps are acoustic elements used to optimize the sound in rooms – especially in recording studios, rehearsal rooms, home theaters, and event spaces. They influence how sound waves behave in the room.

1. Absorbers

Goal: to “swallow” sound waves, reducing reverberation and reflections

  • Usually made of porous materials (e.g. foam, mineral wool, textile)

  • Work particularly well in the mid to high frequency range

  • Prevent the room from sounding “echoey” or “rattling”

  • Example: foam panels on the wall, ceiling absorbers

2. Diffusers

Goal: to scatter sound instead of absorbing it or reflecting it straight back

  • Consist of hard, irregular surfaces (e.g. wood, plastic)

  • Disperse sound waves in many directions → more uniform room acoustics

  • Maintain the “liveliness” of the sound without disturbing echoes

  • Especially useful in control rooms, studios, and concert halls

3. Bass Traps

Goal: Control low-frequency sound waves (bass)

  • Bass waves “accumulate” in corners of the room – bass traps are placed there

  • Usually made of dense, absorbent materials in triangular or cylindrical shape

  • Work in the low frequency range (below approx. 250 Hz)

  • Prevent booming or imprecise bass
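A quick calculation shows why low frequencies need dedicated traps: the wavelength λ = c / f grows as the frequency drops, so bass waves are metres long and pass straight through thin foam panels. A small sketch (speed of sound rounded to 343 m/s at 20 °C):

```python
# Wavelength of a sound wave: lambda = c / f
# (c = speed of sound in air, approx. 343 m/s at 20 degrees C)
SPEED_OF_SOUND = 343.0  # m/s

def wavelength(frequency_hz: float) -> float:
    """Return the wavelength in metres for a given frequency."""
    return SPEED_OF_SOUND / frequency_hz

# Below 250 Hz the waves are longer than ~1.4 m, which is why thin foam
# panels barely affect them and dedicated bass traps are needed.
print(round(wavelength(250), 2))   # 1.37 m
print(round(wavelength(60), 2))    # 5.72 m
```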

A production director (also called a showcaller or show director) ensures that everything at events, shows, or live productions goes exactly according to plan – down to the second.

Tasks of a production director:
1. Responsibility for the show
  • Coordinates the entire production schedule

  • Makes sure that all program items start and end at the right time

2. Communication with all trades
  • Gives stage directions via intercom (e.g., “Sound off,” “Lights on cue 4,” “Camera 2 live”)

  • Coordinates technology, camera, lighting, sound, moderation, stage team, etc.

3. Live control during the show
  • Often sits in the control room or in a central position with a headset

  • Calls every cue (i.e. every sequence command) precisely at the right moment

4. Preparation & scheduling
  • Creates the schedule or cue sheet (which records every second and action of the show exactly)

  • Works closely with directors, production, technical management and, if necessary, clients

5. Interface between creativity and technology
  • Translates creative ideas into technical processes

  • Ensures that all trades speak the same language – and act simultaneously
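The cue sheet the production director calls from can be sketched as a simple data structure. This is a hypothetical miniature example, not a real production document:

```python
# Hypothetical minimal cue sheet: each cue has a time code, a target
# department ("trade"), and the command the production director calls.
cue_sheet = [
    {"time": "00:00:00", "trade": "lights", "call": "House lights down"},
    {"time": "00:00:10", "trade": "sound",  "call": "Opener playback go"},
    {"time": "00:00:30", "trade": "camera", "call": "Camera 2 live"},
]

def calls_for(trade: str) -> list[str]:
    """All calls a single trade has to listen for, in show order."""
    return [cue["call"] for cue in cue_sheet if cue["trade"] == trade]

print(calls_for("camera"))  # ['Camera 2 live']
```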

Typical areas of application:
  • TV shows and live formats

  • Conferences, trade fairs, galas

  • Product presentations, live streams, company events

  • Fashion shows, concerts, hybrid events

Conclusion:

The production director holds all the strings in complex productions and ensures that everything happens “on time” and “on point” – this often only becomes visible when something goes wrong. If everything runs perfectly, it was usually good direction.

Production Manager (Film & Video)

The production manager (German: Aufnahmeleiter) is a central organizational role in film and video production, responsible for the planning, coordination, and execution of filming – from the first day of shooting to the last take.

The production manager ensures that the shoot runs smoothly, efficiently, and within the given time and budget. He or she is the link between production, direction, technology, casting, and all other departments on set.

Overview of the production manager’s tasks:
  • Creation of the shooting schedule based on the script and in consultation with the director and production

  • Organization of filming locations, filming permits, accommodation, transport and catering

  • Coordination of cast, crew and equipment on set

  • Communication with authorities, locations and service providers

  • Creates daily call sheets (daily schedules for the team)

  • Monitoring the schedule during filming

  • Contact person for all organizational questions on set

Important to know:
  • In large productions there are often several levels:

    • Production manager (Aufnahmeleiter, AL) – higher-level organization

    • Set manager – direct organization on location

    • Head of production (Produktionsleiter) – overall budget and personnel responsibility

  • The production manager works closely with the assistant director, the head of production, and the location manager.

Aim of the production manager:

A well-planned, stress-free shoot in which everyone involved is in the right place at the right time – with everything they need.

Video blending in live streaming refers to the process of combining different video sources in real time to create a final, composite image displayed during a live stream. This process involves merging disparate content—such as camera feeds, graphics, pre-recorded footage, or external sources—and transforming it into a coherent visual presentation.

Typical applications of image mixing in live streaming include:
  1. Camera switching: In a live stream that uses multiple cameras, the vision mixer can switch between different camera perspectives to always give the viewer the best view of what is happening.

  2. Overlays and graphics: During a stream, text overlays (such as names or scores in sports broadcasts), logos, animations, or other visual elements can be added to make the stream more informative or engaging.

  3. Picture-in-Picture (PiP): A small image from a camera is inserted into the main image to show, for example, an interview or a second perspective.

  4. Transitions: The vision mixer can insert smooth transitions such as fades or wipe effects between different scenes or camera perspectives to make the stream appear smooth and professional.

In professional live streaming, a vision mixer or video switcher is used to control these various video sources and mix them in real time. For less complex streams, software solutions such as OBS Studio or vMix, which also provide these functions, can be used.

Image mixing helps to create a well-structured and dynamic live stream that is visually appealing and informative for viewers.
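The fades mentioned above come down to simple per-pixel arithmetic: the output is a weighted mix of the two sources, with the weight moving from one to the other over the duration of the transition. A minimal sketch on single 8-bit pixel values:

```python
# A fade (dissolve) mixes two sources pixel by pixel:
# out = (1 - t) * source_a + t * source_b, with t running from 0 to 1.
def crossfade(pixel_a: int, pixel_b: int, t: float) -> int:
    """Blend two 8-bit pixel values; t=0 shows only A, t=1 only B."""
    return int((1 - t) * pixel_a + t * pixel_b)

print(crossfade(0, 255, 0.0))   # 0   (only source A)
print(crossfade(0, 255, 0.5))   # 127 (halfway through the fade)
print(crossfade(0, 255, 1.0))   # 255 (only source B)
```

A real vision mixer applies this to every pixel of every frame while t ramps over, say, one second.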

A CDN (Content Delivery Network) is a globally distributed network of interconnected servers that ensures that web content such as images, videos, or web pages reaches users quickly and reliably.

In short:

A CDN stores copies of content on servers near the user to shorten loading times, reduce server load, and increase availability.

Example: If someone from Germany accesses a US website, a European CDN server delivers the content – faster and more efficiently.
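A toy sketch of the routing idea behind that example – the edge locations and latency figures here are invented, and real CDNs route via DNS and anycast rather than a lookup table:

```python
# Toy illustration (not a real CDN API): pick the edge server with the
# lowest measured latency to the user.
edge_latency_ms = {            # hypothetical measurements from Germany
    "frankfurt": 12,
    "new-york": 95,
    "singapore": 180,
}

def nearest_edge(latencies: dict[str, int]) -> str:
    """Return the edge location with the smallest latency."""
    return min(latencies, key=latencies.get)

print(nearest_edge(edge_latency_ms))  # frankfurt
```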

A cinematic look describes, in the film industry, a visual style reminiscent of major cinema productions. It ensures that an image looks “like a movie”: atmospheric, high-quality, emotional, and consciously designed.

What does "cinematic look" mean?

A cinematic look is created when several design and technical elements work together. It makes images appear cinematic, aesthetic, and rich in depth – in contrast to a flatter, more “video-typical” look.

In short:

A cinematic look makes a video appear as if it were a scene from a movie – emotional, high-quality, and thoughtfully designed.

Color grading in video production is the creative process of adjusting and styling the colors of a video after shooting to create a specific mood, atmosphere, or consistent look. It's an important part of post-production and goes beyond purely technical color correction.

Here is a brief overview:
What happens during color grading?
  1. Create mood:
    Colors influence emotions. Warm tones can evoke a sense of security or nostalgia, while cool tones create a sense of distance or gloom.

  2. Visual consistency:
    Color grading helps to create a consistent overall image, especially for scenes shot at different times of day or in different locations.

  3. Look & Style:
    Certain films or series have a clearly recognizable look – for example, the greenish-cool style in The Matrix or the desaturated western look in Breaking Bad.

  4. Artistic expression:
    By deliberately changing colors, contrast, saturation and light levels, a unique visual style can be created.
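The warm/cool adjustment described in point 1 can be illustrated on a single pixel. Real grading tools work on whole frames with curves and color wheels; this only shows the principle:

```python
# Sketch: warm up an RGB pixel by raising red and lowering blue.
def clamp(v: int) -> int:
    """Keep a channel value inside the valid 8-bit range."""
    return max(0, min(255, v))

def warm(pixel: tuple[int, int, int], amount: int) -> tuple[int, int, int]:
    """Shift a pixel toward warm tones (more red, less blue)."""
    r, g, b = pixel
    return (clamp(r + amount), g, clamp(b - amount))

print(warm((128, 128, 128), 20))  # (148, 128, 108) - a warmer grey
```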

Corner logos are logos that are typically placed in one of the corners of an image, video, or web page. They often serve as a branding element without overpowering the main content.

Typically, you see corner logos in areas like:
  1. Live streams or videos: Here, the logo of a company or brand is placed in one of the corners of the video to keep the brand visible throughout the broadcast without compromising the viewer experience.

  2. Websites: On websites, the logo can also appear in a corner to show the identity of the site without distracting too much from the content.

  3. Marketing materials: On printed or digital materials, the logo can appear in a corner to create a subtle brand presence.

A corner logo is therefore a subtle way to promote brand identity without distracting the user's attention from the actual content.

Dedicated IP connections (also: dedicated lines or Dedicated Internet Access) are exclusive, permanently assigned data connections that are used by only one customer – in contrast to “shared connections”, which are shared by several users.

A dedicated IP connection is like your own digital highway between your location and the internet or another network – fast, stable, and free of traffic jams caused by other users.

Practical example:

For a large livestream event or a remotely directed production, a dedicated fiber optic line with a static IP is provisioned so that:

  • the bandwidth belongs only to you – no performance loss due to other users

  • the signal is transmitted without interruption or delay

  • the director has secure, direct control

A Digital Imaging Technician (DIT) is a specialized professional in film and video production who plays a central role in the digital recording process. The DIT ensures that the digital footage is correctly recorded, backed up, checked, and prepared for further processing.

The main tasks of a DIT at a glance:
Image control and quality assurance
  • Monitors camera settings (e.g. exposure, color balance, contrast)

  • Works closely with the camera crew and the director to achieve the desired look

  • Uses professional monitors and color assessment tools

Data management
  • Transfers footage securely from storage media to backup hard drives

  • Creates multiple redundant backups (at least 2–3 copies)

  • Organizes the data structure for the smooth running of post-production
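The point of those redundant copies is that each one is verified against a checksum of the original card before the card is wiped. A minimal sketch of the idea, using Python's standard hashlib (real DIT tools like checksum-verified copy utilities do this per file):

```python
# Sketch: every backup copy is verified against the checksum of the
# original card's contents before it counts as a valid backup.
import hashlib

def checksum(data: bytes) -> str:
    """SHA-256 fingerprint of a blob of data."""
    return hashlib.sha256(data).hexdigest()

original = b"raw camera footage"   # stand-in for a card's contents
backup_a = b"raw camera footage"   # copy on backup drive A
backup_b = b"raw camera footage"   # copy on backup drive B

ref = checksum(original)
print(all(checksum(c) == ref for c in (backup_a, backup_b)))  # True
```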

Preparation for post-production
  • Converts the raw material into suitable formats if necessary (transcoding)

  • Creates proxies (smaller versions of the recordings) for editing

  • Documents all metadata and, if necessary, on-set notes for the editors and colorists

Look management
  • Creates so-called LUTs (Look-Up Tables) that simulate a desired image look

  • Ensures that the artistic look is visible and assessable on set
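A LUT is literally a look-up table: every input value maps to an output value. The sketch below builds a simple 1D gamma-style LUT for 8-bit values; grading LUTs in practice are usually 3D and cover all three color channels at once:

```python
# A 1D LUT maps every input value to an output value.
GAMMA = 0.4545  # approx. 1 / 2.2, a gamma-style brightening curve

# Precompute the table once for all 256 possible 8-bit values.
lut = [round(255 * (i / 255) ** GAMMA) for i in range(256)]

def apply_lut(pixel_values):
    """Applying a LUT is just indexing - no math per pixel at runtime."""
    return [lut[v] for v in pixel_values]

print(lut[0], lut[255])        # 0 255 (black and white stay anchored)
print(apply_lut([64, 128]))    # mid-tones are lifted
```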

Why is a DIT so important?

A DIT is the link between the camera department and post-production. He or she guarantees that all recordings are technically flawless, securely stored, and well organized. A DIT is indispensable, especially for high-quality productions such as concerts with many cameras.

An Easyrig is a portable camera support system that helps camera operators carry heavy cameras safely and comfortably during longer shoots.

Briefly explained:

An EasyRig looks like a backpack with a boom arm above your head. The camera hangs from this arm via a cable system with dampening. This distributes the camera's weight across your back and hips, rather than just your arms or shoulders.

Areas of application:
  • Documentary films

  • Reports

  • Music videos

  • Events & Live Streams

  • Advertising shoots

An Easyrig is ideal especially for situations with a lot of movement, handheld camera work, or longer takes (e.g. interviews).

Advantages of an Easyrig:
  • Relieves back, shoulders and arms

  • Longer shooting times without fatigue

  • Stabilizes the camera for smooth movements

  • Compatible with gimbals, DSLRs, cine cameras, etc.

  • Different models depending on the camera weight (e.g. Easyrig Vario 5, Minimax)

Important:

An Easyrig doesn't replace a gimbal or Steadicam – it stabilizes not electronically but mechanically, through load relief. However, many camera operators combine it with gimbals (e.g., Ronin 2 or Movi Pro) to get the best of both worlds.

An editor (in German also called a Cutter) is the person in post-production who edits the raw footage (all shot scenes) into a finished film or video. An editor ensures that many individual clips become a compelling, well-told story.

What exactly does an editor do in post-production?
1. Review of the material
  • Watches all the raw footage (often many hours).

  • Selects the best takes and moments.

2. Editing & Assembly
  • Puts scenes together in the correct order.

  • Determines the tempo, rhythm and transitions.

  • Pays attention to continuity errors (consistency in movement, clothing, light, etc.).

  • Often works from a storyboard or script – but with creative freedom.

3. Sound editing
  • Cuts dialogue, sound effects, music and atmosphere neatly together.

  • Removes annoying noises or volume differences.

  • Sometimes also creates a first rough sound mix before sound design takes over.

4. Inserting music and effects
  • Uses music purposefully to enhance emotions.

  • Adds transitions, fades, text panels or visual effects (depending on the project).

5. Collaboration with the director
  • An editor implements the director's vision – but also contributes their own creative ideas.

  • There are often several cuts until everyone is satisfied (e.g. Rough Cut → Fine Cut → Final Cut).

Goal of an editor:

Engage the audience emotionally, tell a clear story and create good transitions.

An embed code is a piece of code (usually HTML, JavaScript, or similar) that is inserted into a website or application to display or integrate external content or functionality. This code allows content such as videos, maps, forms, social media feeds, or widgets to be integrated directly into an existing page without having to host the content on your own server.

An example of embedded code is the YouTube embed code, which can be used to display a YouTube video directly on a web page. The code is simply inserted into the page's HTML code, and the video is then displayed on the page without the visitor having to visit YouTube directly.

In short:

An embed code allows you to embed external content or functions into your own website or application.
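As a sketch, here is a small helper that builds a YouTube-style embed snippet. The video ID is a made-up placeholder, and the exact snippet YouTube hands out may differ in its attributes:

```python
# Sketch: building a YouTube-style iframe embed snippet.
def youtube_embed(video_id: str, width: int = 560, height: int = 315) -> str:
    """Return an HTML iframe string for embedding a video on a page."""
    return (
        f'<iframe width="{width}" height="{height}" '
        f'src="https://www.youtube.com/embed/{video_id}" '
        f'frameborder="0" allowfullscreen></iframe>'
    )

snippet = youtube_embed("VIDEO_ID")   # hypothetical placeholder ID
print("<iframe" in snippet)  # True - ready to paste into a page's HTML
```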

Encoding and decoding simply explained

Encoding means that a video or audio signal is compressed and converted into a digital format so it can be streamed or stored. For example, the camera delivers the raw image, and an encoder turns it into a live stream for YouTube or Facebook.

Decoding is the reverse process: the compressed signal is made readable again – “decoded” – so that it can be correctly displayed or further processed on a screen or in a control room.

Example:
  • Encoding: Camera → Encoder → Livestream

  • Decoding: Livestream → Decoder → Screen or control software

In short:

Encoding makes the video broadcastable, decoding makes it visible. Both processes are essential for smooth live streaming.
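The encode/decode round trip can be shown with general-purpose compression from Python's standard library. Video codecs such as H.264 are far more complex (and usually lossy), but the principle is the same: compress for transport, decompress for display.

```python
# Analogy using lossless zlib compression, not a real video codec.
import zlib

raw_signal = b"frame data " * 1000      # stand-in for raw video frames

encoded = zlib.compress(raw_signal)      # "encoder": shrink for the network
decoded = zlib.decompress(encoded)       # "decoder": restore for the screen

print(decoded == raw_signal)             # True - nothing was lost
print(len(encoded) < len(raw_signal))    # True - far fewer bytes to send
```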

An EVS operator:

works in live TV productions and is responsible for recording, playing back, and editing video material in real time – e.g., for slow motion, replays, or highlight compilations.

An EVS system:

is a professional video server designed specifically for live TV productions. It enables simultaneous recording, playback, and editing of video material in real time – ideal for replays, slow motion, and highlight clips at sporting events, shows, or concerts.

A flight case is a robust transport case specially built to carry sensitive equipment safely and shockproof – especially in event, music, film, and media technology.

What is a flight case used for?
  • Lighting and sound equipment (e.g. mixing consoles, microphones, cables)

  • Musical instruments (e.g. guitars, synthesizers)

  • Camera technology & lenses

  • Monitors, servers, computers

  • Trade fair & presentation equipment

Advantages:
  • Protection during transport (including by plane, hence the name)

  • Durable and reusable

  • Professional look & organization

Fiber optics:

(also fiber optic cable, German: Glasfaser) is a modern technology for data transmission with light – extremely fast, stable, and future-proof.

A fiber optic is a thin strand of quartz glass or plastic through which light signals are sent, transmitting data at high speeds – much faster than copper cables.
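Even light in glass has a finite speed: the refractive index of typical fiber is roughly 1.47, so signals travel at about 200,000 km/s. A rough latency estimate (figures are approximations, and real links add switching delays):

```python
# Light in glass travels slower than in vacuum (refractive index ~1.47).
C_VACUUM_KM_S = 299_792.458
REFRACTIVE_INDEX = 1.47

def fiber_latency_ms(distance_km: float) -> float:
    """One-way propagation delay over a fiber link, in milliseconds."""
    speed = C_VACUUM_KM_S / REFRACTIVE_INDEX   # ~203,940 km/s in the glass
    return distance_km / speed * 1000

print(round(fiber_latency_ms(100), 2))   # ~0.49 ms for a 100 km link
```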

A green screen is a green background surface used in film, video, and photo production to digitally replace the background. This process is called chroma keying.

How does a green screen work?
  1. The subject is shot against a green background

  2. In post-production, the green color is keyed out

  3. The green area is replaced by any image, video, or digital environment
    → e.g. weather maps, fantasy worlds, virtual sets

Why green?
  • Green is a color that hardly occurs in human skin, which makes filtering out easier

  • Camera sensors are particularly sensitive to green (cleaner separation)

  • Alternatively, blue can be used → blue screen (e.g., when green clothing is worn)
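The keying step can be sketched per pixel: if green clearly dominates the other channels, substitute the background pixel. Real keyers work in better color spaces and soften the edges, but the core test looks like this (all values here are toy data):

```python
# Per-pixel sketch of chroma keying on (R, G, B) tuples.
def key_pixel(fg, bg, threshold=50):
    """Replace a foreground pixel with the background if it is 'green'."""
    r, g, b = fg
    is_green = g - max(r, b) > threshold
    return bg if is_green else fg

foreground = [(0, 210, 10), (180, 160, 150)]   # green pixel, skin-tone pixel
background = [(20, 20, 80), (20, 20, 80)]      # virtual set (dark blue)

composite = [key_pixel(f, b) for f, b in zip(foreground, background)]
print(composite)  # [(20, 20, 80), (180, 160, 150)]
```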

A hybrid event is an event format that combines physical and digital participation.

This means that part of the audience is present on site, while another part participates remotely via video transmission (livestream) or via digital platforms (e.g. Zoom, YouTube Live, MS Teams).

Conclusion:

A hybrid event is an event that takes place simultaneously in-person and online, with content streamed live via video so that both physical and virtual attendees can participate and interact in real time.

Interactive online shopping refers to an online shopping experience in which shoppers are actively involved in the decision-making process, often with features that enable direct interaction with the website or app. Unlike traditional online shopping, where shoppers passively browse and purchase products, interactive online shopping offers additional features that personalize the user experience and make the purchasing process more engaging.

Features of interactive online shopping:
  • 360-degree views and zoom: Interactive features such as 360-degree views or zooming allow shoppers to view products from all angles and examine details, simulating physical shopping in a store.

Live shopping events and social commerce:
  • At Live shopping events Customers can interact in real time with presenters, product experts, or influencers who present products and answer audience questions. Users can shop immediately while following the presentation.

  • Social Commerce integrates interactive features into social media so users can purchase products directly through platforms like Instagram, TikTok, Facebook, LinkedIn, or their own websites while browsing content.

Intercom systems:

are communication systems through which all team members of a video production – e.g., director, camera operators, sound, and technology – can communicate in real time via headset.

Why are they important in the video industry?

They enable fast, clear, and coordinated communication, which is crucial for live productions, complex shoots, and live streams. Without intercom, precise camera movements, spontaneous directing decisions, or smooth workflows would hardly be possible.

A camera dolly is a mobile camera cart system that allows the camera to move evenly and in a controlled manner – e.g., sideways, forwards, backwards, or in a circle. It often runs on tracks or special wheels and is used in professional film, TV, and advertising productions.

What does a camera dolly do?

A camera dolly enables:

  • smooth camera movements without shaking

  • Dynamic perspective changes during recording

  • Tracking shots following people, cars, etc.

  • smooth movements in the studio or on set

Construction of a camera dolly:

A classic dolly consists of:

  • Chassis (with wheels or rail rollers)

  • Seat for the cameraman or dolly grip (optional)

  • Stage for tripod, tripod head or camera arm

  • Rails (tracks) for precise linear movements

  • sometimes with hydraulic or electric drives

Possible uses:
  • Linear camera movements (e.g., dialogue scenes)

  • Circular movements around a motif

  • Push-ins (camera moves towards the subject)

  • Pull-outs (camera moves away from the subject)

Conclusion:

A camera dolly is ideal for controlled, smooth moves, especially in professional productions. It ensures high image quality and cinematic camera movements – a classic on film sets.

An LED wall is a large digital display surface consisting of many small LED modules (light-emitting diodes). These modules light up in different colors and together form an image or video in real time.

What makes an LED wall special?
  • Bright, brilliant image – clearly visible even in direct sunlight

  • Large areas possible – from a few square meters to entire house facades

  • Seamless display – unlike composite monitors

  • Flexible use – indoor and outdoor, mobile or permanently installed
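The resolution of an LED wall follows from its pixel pitch (the distance between LED centres, in millimetres). A quick calculation with illustrative figures:

```python
# Pixel pitch determines how many pixels fit into a wall of a given size.
def wall_resolution(width_m: float, height_m: float, pitch_mm: float):
    """Return (horizontal, vertical) pixel counts for an LED wall."""
    px_w = int(width_m * 1000 / pitch_mm)
    px_h = int(height_m * 1000 / pitch_mm)
    return px_w, px_h

# A 5 m x 3 m wall with a 2.6 mm pitch:
print(wall_resolution(5, 3, 2.6))   # (1923, 1153)
```

A finer pitch (smaller number) means more pixels per metre and sharper images at close viewing distances.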

A live chat is a real-time communication option that allows users to communicate directly with a support team, person, or system via text messages. This service is often used on websites or in apps to provide quick help or answers to questions.

Live chats are used in various contexts, such as:
  • Interactive entertainment: During live streams or online events, viewers can ask questions or make comments that are answered in real time.
  • Customer Support: Users can speak to a representative immediately for assistance with technical questions, orders, or issues.

  • Sales advice: Customers can chat with a salesperson in real time to get information about products or services.

Compared to traditional communication channels such as emails or phone calls, live chat is faster and more direct, as responses are provided almost immediately.

Live direction in a video broadcast refers to the real-time control and coordination of all visual and audiovisual elements during a live stream or live broadcast. Its role is to ensure that what viewers see is well presented by mixing and controlling various cameras, graphics, videos, and audio sources in real time.

Important aspects of live directing:
  1. Camera switching: The live director coordinates which camera angle or camera is shown at a specific time. If multiple cameras are in use, the director ensures that the right angle is shown at the right time to optimally capture the action.

  2. Graphics and overlays: Graphics are often overlaid, such as names, statistics, logos, titles, or text. The live director inserts these graphic elements into the stream and controls when they appear and disappear.

  3. Sound mixing: In addition to the picture, the live director also ensures that the sound is mixed correctly. This means harmonizing microphones, music content, and other audio sources and adjusting volume levels if necessary.

  4. Transitions and Effects: During the broadcast, the live director uses transition effects, such as fades or wipes, to create smooth and engaging transitions between scenes or camera angles. This makes the broadcast professional and visually interesting.

  5. Direction: The director or live control person gives instructions to the team to ensure all elements are well coordinated. This includes controlling cameras, timing graphics, fading videos, or reacting to unexpected events.

Live directing tools:
  • Video switcher: A device or software that allows switching between different cameras and video sources in real time.

  • Teleprompters and stage announcers: To help speakers or presenters follow their lines or instructions, especially during live news broadcasts or shows.

  • Audio mixer: A device responsible for mixing various audio sources to ensure the sound is clear and balanced. This position is often supplemented by an experienced sound engineer.

Applications of live direction:
  • Television broadcasts: In live news broadcasts, sports broadcasts or live events, precise and fast live direction is necessary to present the action smoothly and professionally.

  • Event streaming: At large events, concerts, or online streaming events, live direction is often used to combine the various visual and auditory elements and ensure a smooth presentation.

  • Live productions: Live productions such as theater performances, conferences, or seminars also require live direction to manage all technical aspects of the broadcast.

Conclusion:

Live direction is crucial to ensuring a high-quality and engaging video broadcast. It ensures that all visual and audiovisual elements are perfectly aligned in real time, providing viewers with a professional and seamless experience.

A live broadcast is a real-time video or audio connection between two or more locations, used to transmit content or conversations live – i.e., immediately, at the time of the event.

The term originated in radio and television, but is now also used in companies, at events, and in the digital environment. A live broadcast can be one-way (broadcast only) or two-way (dialogue).

Typical areas of application:
  • TV broadcasts: e.g., live feeds to reporters on site

  • Corporate communications: e.g., addresses from management to multiple locations

  • Events & Features: e.g., live connections from guests, speakers, or experts

  • Politics & Conferences: e.g., remote participation of delegations or participants

Technically speaking:

During a live broadcast, the video and/or audio signal is transmitted in real time via digital networks (e.g., satellite, internet, fiber optic, or mobile). Commonly used are:

  • Live streaming platforms (e.g. YouTube Live, Vimeo)

  • Video conferencing systems (e.g. Zoom, MS Teams)

  • Broadcast technology (e.g. SNG transmissions on TV)

Livestream:

Live streaming refers to the transmission of audio and video content in real time over the internet. Events, shows, news, games, or other activities are broadcast live, allowing viewers to watch them immediately without having to wait for a later recording.

Livestreaming is often used on platforms like YouTube, Twitch, Facebook, or Instagram. It is used for a wide variety of purposes, from entertainment formats and news broadcasts to interactive events where the audience can communicate with streamers or moderators in real time.

Patch panel:

A patch panel (also called a patch bay) in video technology is a central connection and switching point for cable connections – especially for video, audio, and control signals. It is used to flexibly organize, reroute, and manage signal paths – without permanent rewiring.

Pay-per-view:

Pay-per-view (PPV) in video and livestream production is a payment model where viewers pay a one-time fee for access to a specific livestream or video-on-demand content.

Access is usually limited by time or usage – e.g., for the duration of the live stream or for a fixed period of time for on-demand content.
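The access rule described above boils down to a time-window check: a ticket is valid only between purchase and expiry. A minimal sketch with plain Unix timestamps (all values are hypothetical):

```python
# Sketch of pay-per-view access: valid from purchase until expiry.
def has_access(now: float, paid_at: float, valid_seconds: float) -> bool:
    """True while 'now' falls inside the paid access window."""
    return paid_at <= now < paid_at + valid_seconds

PAID_AT = 1_700_000_000          # hypothetical purchase timestamp
VALID = 48 * 3600                # e.g. 48 hours of on-demand access

print(has_access(PAID_AT + 3600, PAID_AT, VALID))        # True
print(has_access(PAID_AT + 3 * 86400, PAID_AT, VALID))   # False
```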

Post-production refers to the process that takes place after the actual shooting or production of a film, video, or audio content. It encompasses all the steps necessary to edit and refine the raw material before it is published or distributed.

Post-production covers several important areas, including:
  1. Editing: The process of assembling and editing the recorded footage to achieve the desired structure and narrative.

  2. Sound editing: This includes mixing and editing sound recordings, adding music, sound effects, and dubbing.

  3. Visual Effects (VFX): Computer-generated imagery (CGI) or other visual effects added to enhance the visual experience.

  4. Color correction: Adjusting the colors in an image to create a certain atmosphere or enhance the material.

  5. Titles and graphics: Add text, logos or animated graphics.

Post-production is a crucial step in transforming raw footage into a finished, professional-looking product. The process is used in film production, music production, video production, and many other media content.

The term PiP stands for picture-in-picture. This is a technique used in video and livestream production where multiple video sources are displayed simultaneously in one image, with one or more shown smaller.

What is Picture-in-Picture (PiP)?

Picture-in-Picture is a layout where:

  • a main image (e.g. the presenter or a music video) takes up most of the screen,

  • one or more smaller windows (e.g., band members, interview partners, presentations) are displayed simultaneously – often at the edge or in a corner.

Typical areas of application:
  • Live streaming of interviews (e.g. moderator in the main picture, interviewee in the PiP window)

  • Gaming streams (Gameplay large, streamer camera small)

  • Presentations or webinars (PowerPoint slide large, speaker small in the picture)

  • TV broadcasts (e.g. commentator window during sports broadcasts)

  • Conferences & hybrid events (multiple speakers visible at the same time)

Advantages of PiP in live streaming:
  • More interaction and dynamics in the stream

  • Simultaneous display of multiple perspectives

  • Professional look (like on TV)

  • Ideal for decentralized or remote productions
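Compositing a PiP window is conceptually just copying a small frame into a region of the large one. A toy sketch where frames are nested lists of pixel values (stand-ins for real video frames):

```python
# Sketch: paste a small frame into the top-right corner of a big frame.
def picture_in_picture(main, inset, margin=1):
    """Return a copy of 'main' with 'inset' placed in the top-right corner."""
    out = [row[:] for row in main]                 # copy the main frame
    x0 = len(main[0]) - len(inset[0]) - margin     # top-right placement
    for y, row in enumerate(inset):
        for x, px in enumerate(row):
            out[margin + y][x0 + x] = px
    return out

main_frame = [[0] * 8 for _ in range(6)]   # 8x6 "video frame", all black
inset_frame = [[9] * 3 for _ in range(2)]  # 3x2 second source

framed = picture_in_picture(main_frame, inset_frame)
print(framed[1][4:7])   # [9, 9, 9] - the inset sits in the corner
```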

PTZ cameras are special video cameras that can be remotely controlled in three directions:

PTZ = Pan – Tilt – Zoom
  • Pan → swivel sideways (left/right)

  • Tilt → tilt (up/down)

  • Zoom → zoom in or out

PTZ cameras are motorized cameras that can be moved and controlled remotely – ideal for events, live streams, conferences, church services, theater, sports broadcasts and much more.

A recording refers to a capture of audio, video, or both. It is the process by which sound or image material is captured and stored – whether for music, podcasts, interviews, films, conferences, or other formats.

In the audio field, a recording can be a piece of music, a podcast, or a voice message. Video recordings are the capture of moving images, such as films, tutorials, or conferences.

The recording is the technical step in which the content is created and stored on a medium such as hard drives, CDs or in the cloud.

A director in video production is the creative lead of the project – the visionary behind it, so to speak. He or she ensures that the story, mood, and style of the film or video are realized as planned (or dreamed).

What exactly does a director do?

1. Develops the artistic vision

  • Decides how the story should be told – in sound, image, rhythm, and emotion.

  • Works closely with writers (if there is a script).

  • Determines the visual and narrative style (e.g., dark, comedic, dramatic).

2. Leads the team

  • Gives instructions to:

    • Camera operators (how should a scene be shot?)

    • Actors (how should they act, feel, react?)

    • Sound, lighting, costume, and production designers

  • Coordinates all creative areas to realize the vision.

3. Works with the actors

  • Guides them through their roles.

  • Provides feedback on gestures, facial expressions, speed, emotions.

4. Supervises the filming

  • Is present for every scene.

  • Decides whether a take is “good” or needs to be repeated.

  • Pays attention to details like timing, mood, and continuity errors.

5. Helps shape post-production

  • Collaborates with editors on the edit.

  • Provides feedback on music, sound design, and color grading.

  • Helps decide on the final version of the video.

Goal:

A coherent, emotionally powerful end product that draws viewers into the story – whether it’s a music video, short film, commercial, or feature film.

A remote video production is a video transmission or recording in which the production members involved are not in the same physical location. Instead, various parts of the production (such as cameras, sound, editing, and directing) are connected via the internet or other networks and controlled in real time.

In a remote video production, individual team members or even the entire production can work from different geographical locations. This is made possible by the use of modern technologies such as cloud-based software, network technology, and specialized hardware.

Features and components of remote video production:
  1. Distributed teams: In a remote production, camera operators, directors, sound engineers, and editors may work in different locations, but their work is coordinated in real time. This can be especially useful for live broadcasts or large, international events.

  2. Cloud-based tools and software: Many remote productions use cloud services like vMix, OBS Studio, Wirecast, or specialized platforms like LiveU or Airtime. These tools allow videos to be streamed, edited, and cut while being accessible over the internet.

  3. Real-time data transmission: The camera images and audio signals are sent via networks to a central control center, which monitors, processes, and broadcasts the transmission in real time.

  4. Communication between teams: Teams use communication tools such as intercom systems, Zoom, Skype, or specially developed solutions to ensure that all production members are well connected and can coordinate their tasks.

  5. Virtual Direction: Directing is often done remotely, with the director monitoring and controlling the various video sources (e.g., cameras, graphics) in real time, similar to a traditional live production, but without being physically present on location.

Advantages of remote video production:
  1. Cost savings: Since no travel or large on-site production teams are required, remote productions can save costs on staff, travel, accommodation, and equipment.

  2. Flexibility: Production members can work from anywhere, which is particularly advantageous in a globalized world. It also allows for a wider selection of talent and experts, regardless of their geographic location.

  3. Faster response times and agility: Remote productions enable rapid response to changes or new requirements because they are often based on flexible, scalable software solutions.

  4. Reaching larger audiences: Especially for large live events, content can be streamed and edited worldwide through remote productions, reaching a broader and more global audience.

Applications of remote video production:
  • Live streaming of events: Sports events, conferences, webinars, and music festivals can be streamed live, with the production team monitoring and controlling the event remotely.

  • News broadcasts: News teams can compile and broadcast stories in real time from different parts of the world without all reporters and technicians being physically in one location.

  • Corporate productions: Companies can produce their video content (e.g., for training or marketing) remotely and use resources efficiently.

  • Film and TV productions: This technology is also used in films or series productions where certain scenes or parts of the production are edited or controlled remotely.

Conclusion:

Remote video production leverages modern technologies to enable high-quality video creation and delivery without requiring everyone to be in the same location. It offers high flexibility and cost-saving benefits, making it especially important in today's digitally connected world.

A sound designer is responsible for the acoustic design of a film or video in post-production.

While an editor works with the visual material, a sound designer shapes the sound world that makes the story feel emotional, realistic, or even magical.

What exactly does a sound designer do?

1. Create and incorporate sound effects

  • Adds realistic sounds such as footsteps, doors, rain, car noises – often recorded as Foley sounds.

  • Uses sound effects to emphasize or exaggerate things (e.g., sci-fi sounds, explosions).

2. Building soundscapes

  • Creates atmospheres: forest, city, subway, abandoned spaces.

  • Ensures that every scene sounds spatial and atmospheric.

3. Dialogue editing

  • Cleans and optimizes voice recordings (noise reduction, volume adjustment).

  • Adapts dialogue to environments to make it seem “real.”

  • Edits ADR (post-dubbed dialogue) if necessary.

4. Creative design

  • Develops completely new sounds, for example for fantasy or sci-fi worlds.

  • Uses distortion, layering, reverb, or synthesizers to create unique sounds.

5. Sound mix & finalization (mixing)

  • Mixes all audio tracks (dialogue, music, sounds) so that everything is clearly audible.

  • Works closely with composers, sound engineers and directors.

Goal of a sound designer:

Bring the film to life audibly. The sound adds depth, emotion, tension, and realism.

A Steadicam is a camera stabilization system that enables camera operators to capture smooth, shake-free shots – even while moving.

Simply explained:

Instead of holding the camera directly in the hand (which quickly leads to camera shake), it hangs on a mechanical arm attached to a vest worn by the operator. A sophisticated spring and counterweight system compensates for movements such as walking, running, or climbing stairs.

Advantages of a Steadicam:
  • Smooth camera movements without wobbling

  • Dynamic perspectives (e.g. through corridors, between people, across stages)

  • No tripod needed, but still stable

  • Professional look, as we know it from cinema productions

Where is it used?
  • At live events such as concerts or conferences (e.g. re:publica)

  • In film and television

  • During sports broadcasts

  • In music videos and commercials

Wireless tally systems show camera operators and presenters, via an illuminated indicator (usually red), which camera is currently live.

Why are they important in the video industry?

They ensure clear orientation during a live production. This way, camera operators know when they are "on air," and presenters can speak directly into the active camera – making the production more professional, efficient, and less error-prone.

A teleprompter is a device that reflects text so that presenters or speakers can read it directly from the camera's perspective without taking their eyes off the lens.

This is how a teleprompter works:
  • A screen displays the text (e.g., a script or bullet points).

  • A semi-transparent glass pane reflects this text toward the person in front of the camera.

  • The camera films through the glass, so the audience cannot tell that the text is being read.

Why is a teleprompter important in video production?
  • It helps speakers talk fluently and confidently without having to memorize the text.

  • Eye contact with the camera is maintained → professional appearance.

  • Particularly useful for live broadcasts, news, commercials, or hosted shows.

In short:

A teleprompter is the invisible reading device for professionals in front of the camera – it ensures clear speech, fewer slip-ups, and more confidence.

What does transcoding mean in the video industry?

In the video industry, transcoding refers to the process of converting a digital video from one format, codec, or quality level to another. The goal is to optimally adapt the video material to different output devices, platforms, or bandwidths.

During transcoding, video files are re-encoded – that is, they are first decompressed (decoded) and then compressed again (encoded) with a different codec, bitrate, or resolution. This creates a new file tailored to the specific requirements.

Typical use cases for transcoding:
  • Adaptation to different end devices (e.g., smartphone, tablet, smart TV, desktop)

  • Optimization for different internet connections (e.g., Full HD, HD, SD, low quality)

  • Conversion into streaming-compatible formats (e.g., HLS or MPEG-DASH)

  • Creation of several quality levels for adaptive bitrate streaming

Practical example:

During a live broadcast, the original video feed is provided in several versions in parallel – for example, 1080p, 720p, 480p, and 360p. A media server or cloud encoder automatically transcodes the incoming stream into these resolutions. Viewers then automatically receive the appropriate version depending on their device and network quality.
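Such a set of parallel versions is often called a bitrate ladder. The sketch below illustrates the idea; the rung heights and bitrates are illustrative values, not a broadcast standard:

```python
# Sketch of an adaptive-bitrate (ABR) ladder, assuming a 1080p 16:9 source.
SOURCE_HEIGHT = 1080

LADDER = [
    # (height, video bitrate in kbit/s) -- hypothetical values
    (1080, 5000),
    (720, 2800),
    (480, 1400),
    (360, 800),
]

def rungs_for_source(source_height, ladder):
    """Keep only rungs at or below the source resolution (no upscaling)."""
    return [(h, kbps) for h, kbps in ladder if h <= source_height]

def width_for_height(height, aspect_w=16, aspect_h=9):
    """Derive the 16:9 width for a height, rounded up to an even pixel count."""
    width = round(height * aspect_w / aspect_h)
    return width + (width % 2)  # encoders typically require even dimensions

for height, kbps in rungs_for_source(SOURCE_HEIGHT, LADDER):
    print(f"{width_for_height(height)}x{height} @ {kbps} kbit/s")
```

With a 720p source, the same function would simply drop the 1080p rung, since transcoding normally scales down rather than up.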

In short:

Transcoding makes video content flexible to use – on all devices, at any bandwidth and in the optimal quality.

An OB van (short for outside broadcast van):

is a mobile studio on wheels, used in film, television and video production for live broadcasts or on-site recordings.

Depending on its size and purpose, an OB van contains, among other things:

  • Editing and directing stations

  • Video and sound mixing consoles

  • Live monitoring systems

  • Intercom systems (communication)

  • Signal processing technology

  • Storage and recording systems

  • Satellite technology or fiber optic connection (for live broadcasts)

Video content refers to all types of content created and distributed in the form of videos. These can have a variety of formats and purposes, from entertainment and information to advertising and education. Video content can be published on platforms such as YouTube, Instagram, X, Facebook, or Vimeo, on other social media, or on websites.

Examples of video content include:
  • Explainer videos: convey complex topics in a clear way.

  • Tutorials: Show viewers how to do or use something.

  • Promotional videos: Used to build branding or promote products and services.

  • Vlogs (video blogs): Personal or topic-specific video diaries.

  • Live streams: Live broadcasts of events or interactive content.

  • Entertainment: movies, series, music videos or comedy clips.

Video content is particularly effective because it has both a visual and acoustic impact and can generate a high level of interactivity and enthusiasm.

A videographer is someone who is professionally or artistically involved in video creation. They plan, film, edit, and produce moving-image content – often on a small or medium scale.

What exactly does a videographer do?
Typical tasks:
  • Planning and conception (e.g., script, storyboard)

  • Camera work and sound recording

  • Editing and post-production (color correction, music, effects)

  • Publication or handover to the client

A videographer often works alone or in small teams and takes on several roles simultaneously – in contrast to larger film productions where camera, sound, editing, etc. are distributed among several people.

A video technician is a professional who deals with the technical side of video production, playback, or transmission – in contrast to videographers, who usually work creatively.

Typical tasks:
  • Construction, setup and maintenance of video equipment:

    • cameras

    • Monitors

    • Vision mixer

    • Video walls

    • Projectors

  • Checking and ensuring signal transmission (SDI, HDMI, NDI, etc.)

  • Live directing or collaboration on multi-camera productions

  • Video recording and playback

  • Handling encoders/streamers, for example for live streams

  • Troubleshooting in case of technical problems

Video direction is the area or facility where the technical control of a video production takes place. It is the central location where all audiovisual material is monitored, edited, and mixed in real time to produce the final video or live broadcast. The video direction team handles a variety of tasks, from image and sound mixing to camera control and graphics integration.

Tools and equipment in video directing:
  • Video switcher: A device or software that allows the video director to switch between different video sources.

  • Sound mixer: A device that mixes and processes different sound sources in real time.

  • Graphics systems: Systems that integrate text, logos, animations, or other visual elements into the video.

  • Monitor walls: A collection of screens that allows the director to monitor all camera images and sources simultaneously.

  • Direction communication systems: Systems that allow the video control team to communicate with cameramen, presenters, and other participants.

Difference between video directing and live directing:

Although the term "video direction" is often used in the context of live broadcasts, there is a subtle distinction: Video direction focuses on image and sound management, while live direction deals with the overall coordination of the production, stage direction, and scene planning. The live director thus provides the overall instructions, while the video direction handles the technical implementation of these instructions.

Conclusion:

Video direction is the technical heartbeat of a video production. It ensures that all visual and audiovisual elements are edited and combined in real time to deliver a high-quality final product. It is a key component in many areas, including TV broadcasts, live streams, film productions, and events.

Video on Demand (VOD):

refers to a service where users can watch videos at any time and at their convenience, without being tied to fixed broadcast times. Unlike traditional television, where content is broadcast at fixed times, VOD allows access to films, series, documentaries, or other video content at any time and from various devices such as smartphones, tablets, computers, or smart TVs.

VOD has revolutionized the way we consume media by offering flexibility and a huge selection of content.

A MediorNet is a modular, decentralized real-time media network that transports all important production signals – video, audio, intercom, and control data – together over a single fiber optic network. The term comes from the manufacturer Riedel Communications, which developed MediorNet as a professional infrastructure system for broadcast, events, trade fairs, and live productions.

A MediorNet connects several so-called nodes. Each node can receive, process, route, and output signals. This allows MediorNet to replace many individual signal paths (e.g., SDI cables, audio lines, or intercom links) with a central, flexible network structure.

What makes a MediorNet special?

 – Real-time routing

All signals travel across the network without any noticeable delay.

 – Decentralized structure

The work is not done by one large router, but by many small units working together.

 – High reliability

The system operates redundantly. If one connection fails, another automatically takes over.

 – Integrated signal processing

MediorNet can convert, distribute, synchronize or split signals directly within the network – without additional hardware.

 – Flexible expandability

New nodes can be added at any time, and the network grows with production.

What is MediorNet used for?
  • Distribution of camera signals

  • Video playback on LED walls

  • Integration of intercom systems

  • Transmission over long distances

  • Live events, sports productions, trade fairs and industrial productions

  • Temporary control rooms and mobile productions

In short:

A MediorNet is the backbone of modern live productions and replaces many individual devices with an intelligent, robust fiber optic network.

A lower third (in German, "Bauchbinde") is a graphic overlay that appears in the lower area of the screen. It typically shows important information such as a person's name, the title of a show or story, a person's position, or other relevant details.

Lower thirds are often used in news broadcasts, interviews, talk shows, documentaries, and live streams to give viewers additional information without disrupting the visual flow of the video.

Example:
  • In an interview, a lower third might display the interviewee’s name and position: “John Doe – CEO, Sample Company.”

  • For a news story, it could show the title of the story or the source of the information.

Lower thirds are a useful tool for conveying information clearly and attractively without distracting attention from the main content.

HD stands for High Definition and refers to a higher resolution of image and video formats compared to older standard definition (SD) formats. HD provides clearer, sharper, and more detailed images and videos, which is especially noticeable on larger screens.

There are two main types of HD resolutions:
  1. 720p (HD Ready):

    • Resolution: 1280 x 720 pixels

    • This was the first stage of HD and is often referred to as "HD Ready." It offers a significant improvement over the standard definition (SD) format and is still common for smaller TVs and in certain applications such as streaming on mobile devices.

  2. 1080p (Full HD):

    • Resolution: 1920 x 1080 pixels

    • Full HD is the most common HD resolution and is widely used on modern televisions, computer monitors, and streaming platforms. It offers even sharper picture quality compared to 720p and is the standard for many media content such as Blu-ray Discs and video games.

Features of HD:
  • Sharper images: Compared to standard definition (SD), which typically has a resolution of 720 x 480 pixels, HD delivers much sharper images and more detail.

  • Better colors and contrast: HD formats also offer better color fidelity and more accurate contrast, resulting in more realistic and vibrant picture quality.

Use:

HD formats are widely used in various areas:

  • Television: Most modern television channels and streaming services offer content in HD quality.

  • Blu-ray and digital movies: Blu-ray discs typically offer 1080p video, which provides better quality compared to DVDs.

  • Gaming: Many game consoles and computers support 1080p for better visuals.

  • Streaming: Platforms like Netflix, YouTube, and Amazon Prime Video offer a variety of HD content.

In summary:

HD stands for High Definition and offers high image resolution. It includes 720p (HD Ready) and 1080p (Full HD) and has become the standard in the digital image and video world.

4K denotes a resolution of approximately 4000 pixels in the horizontal axis, which offers significantly higher image quality compared to previous resolutions such as Full HD (1920 x 1080 pixels). It is a term used primarily in reference to televisions, monitors, cameras, and films.

The exact resolution of 4K varies by standard, but the most common is 3840 x 2160 pixels, also known as Ultra High Definition (UHD). For cinema, 4K is often defined as 4096 x 2160 pixels.
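The often-quoted "four times Full HD" relationship follows directly from the pixel counts; a quick check:

```python
# Pixel counts behind the UHD vs. Full HD comparison.
uhd = 3840 * 2160      # UHD "4K" frame: 8,294,400 pixels
full_hd = 1920 * 1080  # Full HD frame: 2,073,600 pixels

# Doubling both width and height quadruples the pixel count.
print(uhd, full_hd, uhd // full_hd)
```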

Advantages of 4K:
  1. Sharper images: The high resolution renders details much clearer and sharper. The difference compared to lower resolutions like Full HD is particularly noticeable on larger screens or with highly detailed content.

  2. Better color representation: 4K content can often also be combined with better color depths and more color spaces (e.g. HDR - High Dynamic Range), which improves the overall visual experience.

  3. Future-proof: As more and more content is produced in 4K, it offers a good future perspective for enjoying the latest media formats in the highest quality.

Applications of 4K:
  • Television and streaming: Many modern TVs support 4K, and platforms such as Netflix, Amazon Prime Video and YouTube already offer 4K content.

  • Cameras and photography: 4K is also popular with video cameras and modern smartphones, which can film in this resolution to produce films or videos of the highest quality.

  • Computer monitors and gaming: 4K monitors are becoming increasingly popular in gaming and professional image editing. They offer better clarity for demanding applications.

Summary:

4K is a standard for very high screen resolution, offering four times as many pixels as Full HD and delivering razor-sharp image quality. It is increasingly used in many areas, including television, cinema, photography, and gaming.

5G is the latest generation of mobile networks, enabling faster internet, lower latency, and more simultaneous connections – perfect for streaming, gaming, smart cities, autonomous driving, and more.

5G is the fifth generation of mobile communications – the successor to 4G/LTE – and brings significantly faster, more stable and smarter mobile connections.

Where is 5G used?
  • Smartphones & Tablets – for ultra-fast mobile internet

  • Autonomous driving – vehicles communicate in real time

  • Industry & Robotics – Machines networked and controllable in real time

  • Medicine – e.g. remote surgery or smart diagnostics

  • Augmented & Virtual Reality – e.g. in gaming or training

  • 5G is already available in many cities, and is still being expanded in rural areas

  • 5G uses new frequency ranges, e.g., at 3.6 GHz and, in the future, millimeter waves (for even more speed)

Major Festivals:

These are significant, often world-renowned events held in specific fields such as music, art, film, culture, or religion, typically attracting large numbers of visitors from different regions or even countries. These festivals are often annual or periodic and have a significant impact on culture and social life. They provide a platform for outstanding artists, performances, and cultural expressions.

The 9:16 format is a portrait-orientation aspect ratio: the width of the image or video is 9 units and the height is 16 units. This format is the opposite of the traditional 16:9 format, which is typically used for televisions, computer monitors, and many video formats, and it is better suited for content viewed on mobile devices such as smartphones or tablets.

Features of the 9:16 format:
  • Portrait: The 9:16 format is vertically oriented, making it ideal for displaying content on mobile devices, which are typically held in portrait orientation.

  • Popular on social media: Platforms like Instagram Stories, TikTok, Snapchat, and YouTube Shorts often use the 9:16 format to display videos and images because users typically hold their phones vertically.

  • Video and photo: It is used for both photos and videos, especially for content that is specifically optimized for mobile devices and social media.

Advantages of the 9:16 format:
  1. Optimization for smartphones: Since smartphones are usually held in portrait format, the 9:16 format is perfect for viewing and creating content on mobile devices.

  2. Better user experience: Users can consume content without rotating their device, improving the user experience on mobile devices.

  3. Social media engagement: The 9:16 format is particularly popular on social networks and short video platforms because it uses the entire screen area of the phone, thus intensifying the visual experience.

Example applications:
  • Instagram Stories: The typical stories on Instagram have the format 9:16.

  • TikTok videos: TikTok videos are also in 9:16 format and offer a vertical display of content.

  • YouTube Shorts: YouTube Shorts, the platform's short video format, also uses the 9:16 format.
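In practice, 9:16 versions of existing 16:9 footage are often produced with a centered vertical crop. A minimal sketch of the arithmetic involved (the function name and rounding rule are illustrative, not any editing tool's actual API):

```python
def vertical_crop(src_width, src_height, aspect_w=9, aspect_h=16):
    """Return (crop_width, crop_height, x_offset) for a centered 9:16 crop
    taken from a landscape source frame, keeping the full source height."""
    crop_width = round(src_height * aspect_w / aspect_h)
    crop_width += crop_width % 2          # keep dimensions even for encoders
    x_offset = (src_width - crop_width) // 2
    return crop_width, src_height, x_offset

# A 1920x1080 source yields a 608x1080 crop, starting 656 px from the left.
print(vertical_crop(1920, 1080))
```

Note how narrow the result is: only about a third of the original width survives, which is why framing for 9:16 is usually planned at the shooting stage rather than left to the crop.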

Conclusion:

The 9:16 format is ideal for displaying content on mobile devices and social platforms, as it perfectly suits the vertical orientation of a smartphone. It is especially popular for short videos and stories on social media, providing a better user experience on mobile.

The 16:9 format is an aspect ratio in which the width of an image or video is 16 units and the height is 9 units. It is the standard format for most modern TVs, computer monitors, and many video formats, including most YouTube videos and movies. The 16:9 format is horizontally oriented, which means it is wider than it is tall.

Features of the 16:9 format:
  • Horizontal: The 16:9 format is designed for horizontal display of content and is ideal for movies, TV shows, and video content.

  • Standard format for HD and 4K: The 16:9 aspect ratio is the standard format for high definition (HD) and ultra high definition (4K) content. This means that most modern TVs and computer monitors use this aspect ratio.

  • Compatibility: The format is used on platforms such as YouTube, Netflix, Hulu, and other video streaming services. Blu-ray discs and most digital cameras also use this format.

Advantages of the 16:9 format:
  1. Widely used: It is the most widely used format for movies, TV shows, streaming services, and also for many online videos.

  2. Visual balance: The 16:9 aspect ratio offers good visual balance by providing enough width for landscape shots, action scenes, and details, while still leaving enough height for faces and vertical elements.

  3. Ideal for widescreen TV and cinema: The format is well suited to presenting films in widescreen format and is also widely used in professional video production.

Typical applications:
  • TVs and monitors: Most modern TVs, computer monitors, and laptops have a 16:9 display.

  • Movies and TV shows: Many movies and TV shows are produced and shown in a 16:9 format, especially in HD.

  • YouTube videos: Most YouTube videos and online content are in 16:9 format.

  • Video games: Many video games are developed for 16:9 screens to provide a wide and immersive gaming experience.

Examples of resolutions in 16:9 format:
  • HD (High Definition): 1280×720 pixels

  • Full HD (1080p): 1920×1080 pixels

  • 4K Ultra HD: 3840×2160 pixels
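All three resolutions listed above reduce to the same 16:9 ratio, which a few lines can verify:

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a pixel resolution to its simplest aspect ratio."""
    d = gcd(width, height)
    return width // d, height // d

# HD, Full HD, and 4K UHD all simplify to 16:9.
for w, h in [(1280, 720), (1920, 1080), (3840, 2160)]:
    print(f"{w}x{h} -> {aspect_ratio(w, h)}")
```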

Conclusion:

The 16:9 format is the standard aspect ratio for video and screen displays, providing a wide, horizontal view ideal for many applications such as television, movies, video games, and online streaming. It is particularly widespread and is used on most modern devices.