My Directing Tips while Filming an Interview

Experience is key in filmmaking. As a director, you need to be in control of both the artistic and the technical aspects of the production. In this case, we are making a live interview called “More Than Violin”. My subject, the violinist Gözde Sevdir, tells her story of life after coming to Brussels. I need to find the best way to film Gözde and to build a connection with her. But how should I do that?

Feel It

When the camera starts rolling, remember two things. First, you are in control of the technical side of the film, which means you need to stay as close to the screen as possible. Try to find the best elements to frame in different ways, but don’t experiment with different shutter speeds or gain levels on the day: know the limits of your camera beforehand, draw on other people’s experience and stick to the standard technical recommendations.

Second, make emotional contact with your subject. During the interview especially, your relationship with the person you are interviewing is very important. Put yourself in your guest’s shoes: you would want to know what’s going on, right? Try to feel what the camera is recording and how your subject feels about it.


Bahar Elif Akyuz films “More Than Violin”

As the director, you must not only decide on the placement of the camera but also spend some time explaining exactly what you want to record at each moment. Don’t forget to talk to your guest; it will make the shoot smoother.

Gözde Sevdir, Violinist, in the studio for “More Than Violin” Interview

Trust Your Crew

You’d be surprised how much your crew wants to do their best, even though the film is the director’s creation in the first place. An enthusiastic team is one that likes to create something special together, and trust will make your crew all the more helpful.

For me, leading a crew means allowing them to share their opinions. For example, while you are concentrating on how the lights read on screen, ask your DP for a better sense of where to place them. Don’t hesitate to ask again and again until you get the best feeling for where to fix the lights.

We all know that the human eye and a camera’s iris don’t receive the same amount of colour information. When it comes to light, be clear about what you really want to get out of it. That’s another reason to let your crew help you.

Vital Coordination with your Sound Engineer

Filmmaking is, after all, a technical art. A director could easily leave the technical understanding entirely to other crew members, but personally I suggest you know at least the essentials. Every role behind the scenes is vital, and so is understanding the technical vocabulary of each of your crew members. Imagine recording your images without sound: all the work goes in the trash. So listen to the needs of your sound engineer.


Keep this simple exchange in mind and use it: press record and call out “camera turns”, then get confirmation from your sound engineer, who answers “sound turns”.

Focus vs blur

While recording your subject, always check your focus. Look at your rushes after a test shot, and always zoom in before setting the focus; this is one difference between photography and video. You need to be in control of your image, so don’t compromise on your creative expectations. Sometimes a blur reflects the deep emotions of your subject, and sometimes the opposite. The worst case is when you forget to set your focus and, instead of a sharp film, you end up with a blur.

Do compromise only if it aligns with your inspiration

We all know that art and style come with restrictions. It is hard to lower your own bar, but you don’t have unlimited resources, time or money while filming. As soon as your creative expectations are satisfied, you should know where to stop.

Being Lighting Director on the Set of ‘Poetry’

As Lighting Director, I was on the set of Ilja Sircenko’s interview with Maria Alkova from St. Petersburg. Briefly: for Maria, poetry is a passion, and she wanted to share her love with us through a poem she wrote herself.

Director Ilja Sircenko on the “Poetry” film set
Poet Maria Alkova reads her own poem

Ilja’s lighting preference was dominantly blue, to evoke the reflection of snow and the cold weather of Russia. The background lighting was also chosen based on that personal choice.

In the ‘Poetry’ interview we worked as a team of five.


Daniel Colin (Director of Photography), Pascaline Crevecoeur (Script Supervisor)

What are the responsibilities of a Lighting Director in cinema?

In cinema, the Lighting Director is responsible for setting up and operating the lighting equipment; the role is also known as ‘gaffer’. This time I oversaw rigging the lighting equipment, carrying out filter tests and positioning the lights during the ‘Poetry’ shoot.

Camille Vercruysse (Sound Engineer)
Bahar Elif Akyuz (Lighting Director)

As we know, studio lighting is all about creating a controlled environment in which to place your subject. Isolating your subject from fluctuating light that creates unwanted shadows can sometimes take hours. We chose three-point lighting for this interview.

What is 3-point lighting?

Three-point lighting is a basic lighting strategy and a classic of Hollywood. The first light is called the key light (attaque or la lumière clé in French). It is the direct light that shines on the subject and serves as the principal light source. The key light is usually placed at a 30–60° angle.

Step by step 3 point Lighting

In outdoor daytime shots, the Sun often serves as the key light. But in our case the filming was indoors, so setting the light in exactly the right position to best capture the subject took a little time.

The fill light (compensation in French) compensates for the shadows on the subject’s face with a second light. The fill also shines on the subject, but from a side angle relative to the key light, and it is often placed lower than the key.

The back light (also called the rim, hair or shoulder light) is another way to balance the light; in French it is called décrochage or contre. This light generally falls on the subject from behind, giving them a rim of light that separates them from the background. Depending on its level, this source can highlight the contours of the shoulders.

What is the chiaroscuro effect?

It takes its name from the high-contrast, dark style of lighting used in Renaissance painting. Without a fill light, a chiaroscuro effect can appear, for example as the shadow cast by a person’s nose across the rest of the face.

The preferred way to use the fill is at just the level of the subject’s face; it helps to balance the key by illuminating the shaded surfaces.

Colour Filters

To conclude, there are three basic types of gels when it comes to photo and video. Diffusion gels are used to diffuse the light; if you want a soft light, use those. The second group is colour-correction gels (CTB, CTO, +Green, -Green). The third group is colour-effect gels, used purely to tint the light creatively.

Bahar Elif Akyuz (Light Director) in the set of “Poetry”

In this particular shoot, we used both diffusion and correction gels. These filters soften the subject’s image and produce a less dramatic rendering of the person.

Hubble helps uncover origin of Neptune’s smallest moon Hippocamp

Astronomers using the NASA/ESA Hubble Space Telescope, along with older data from the Voyager 2 probe, have revealed more about the origin of Neptune’s smallest moon. The moon, which was discovered in 2013 and has now received the official name Hippocamp, is believed to be a fragment of its larger neighbour Proteus.

A team of astronomers, led by Mark Showalter of the SETI Institute, have used the NASA/ESA Hubble Space Telescope to study the origin of the smallest known moon orbiting the planet Neptune, discovered in 2013.

Hubble data showing Neptune’s inner moons. [NASA, ESA, and M. Showalter (SETI Institute)]

“The first thing we realised was that you wouldn’t expect to find such a tiny moon right next to Neptune’s biggest inner moon,” said Mark Showalter. The tiny moon, with an estimated diameter of only about 34 km, was named Hippocamp and is likely to be a fragment from Proteus, Neptune’s second-largest moon and the outermost of the inner moons. Hippocamp, formerly known as S/2004 N 1, is named after the sea creatures of the same name from Greek and Roman mythology [1].

A moon stolen from the Kuiper Belt

The orbits of Proteus and its tiny neighbour are incredibly close, at only 12 000 km apart. Ordinarily, if two satellites of such different sizes coexisted in such close proximity, either the larger would have kicked the smaller out of orbit or the smaller would crash into the larger one.

Orbits of Neptune’s inner moons. [NASA, ESA, and A. Feild (STScI)]

Instead, it appears that billions of years ago a comet collision chipped off a chunk of Proteus. Images from the Voyager 2 probe from 1989 show a large impact crater on Proteus, almost large enough to have shattered the moon. “In 1989, we thought the crater was the end of the story,” said Showalter. “With Hubble, now we know that a little piece of Proteus got left behind and we see it today as Hippocamp.”

Hippocamp is only the most recent result of the turbulent and violent history of Neptune’s satellite system. Proteus itself formed billions of years ago after a cataclysmic event involving Neptune’s satellites. The planet captured an enormous body from the Kuiper belt, now known to be Neptune’s largest moon, Triton. The sudden presence of such a massive object in orbit tore apart all the other satellites in orbit at that time. The debris from shattered moons re-coalesced into the second generation of natural satellites that we see today.

Later bombardment by comets led to the birth of Hippocamp, which can therefore be considered a third-generation satellite. “Based on estimates of comet populations, we know that other moons in the outer Solar System have been hit by comets, smashed apart, and re-accreted multiple times,” noted Jack Lissauer of NASA’s Ames Research Center, California, USA, a coauthor of the new research. “This pair of satellites provides a dramatic illustration that moons are sometimes broken apart by comets.”

Notes

[1] The mythological Hippocampus possesses the upper body of a horse and the lower body of a fish. The Roman god Neptune would drive a sea-chariot pulled by Hippocampi. The name Hippocamp was approved by the International Astronomical Union (IAU). The rules of the International Astronomical Union require that the moons of Neptune are named after Greek and Roman mythology of the undersea world.

Audio, Image and Video Compression for Non-Informaticians

Choosing the right codec for every project takes a lot of time, and not everybody is an informatician. My goal here is to reduce the time you need to spend to understand the simple, relevant ways of compressing.

Sometimes the compression process causes a loss of information and sometimes it does not. To save space, we trade away some data by throwing it out; lowering the resolution is one way of doing that. But first of all, how does data compression work?

Representation of an Image

Representing a digital image or sound can take a lot of space. For example, storing a 4-minute song uncompressed takes over 40 megabytes; a 2-hour HD video would take around 1,000 gigabytes. To take up less space, in the real world the information is digitally compressed: the 40-megabyte song can be compressed to about 4 megabytes, the 2-hour video reduced to just a couple of gigabytes, and so on.
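Here is the arithmetic behind those figures, assuming CD-quality stereo audio (44,100 samples per second, 16 bits per sample, two channels) and uncompressed 8-bit 1080p video at 24 frames per second; these parameters are my assumptions, chosen because they reproduce the numbers above:

\[
\begin{aligned}
\text{Audio: } & 44\,100\ \tfrac{\text{samples}}{\text{s}} \times 2\ \text{bytes} \times 2\ \text{channels} \times 240\ \text{s} \approx 42\ \text{MB},\\
\text{Video: } & 1920 \times 1080\ \text{pixels} \times 3\ \text{bytes} \times 24\ \tfrac{\text{frames}}{\text{s}} \times 7200\ \text{s} \approx 1.07\ \text{TB} \approx 1000\ \text{GB}.
\end{aligned}
\]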

But are we aware enough of our options when it comes to compression?

Here are some tips for losing less data, or none at all.

Lossy Data Compression

  1. Spatial compression: redundancy within a single frame is exploited, so identical or near-identical neighbouring pixels are not all stored separately; one representative value can stand in for a group (see the toy sketch below). JPEG, MJPEG, AVC-Intra, All-Intra and animated GIF are examples of intra-frame compression.
  2. In audio: MP3 is a common lossy format (FLAC, by contrast, is lossless and belongs in the next section).
  3. Temporal compression: to reduce storage, keyframes are stored in full and, for the frames in between, only the differences from the previous frame are encoded. MPEG-1, MPEG-2 and MPEG-4 work this way.
  4. Colour sampling (in French échantillonnage): only part of the colour information is registered (chroma subsampling), because the eye is less sensitive to colour detail than to brightness.
Colour sampling schemes for VHS and other cameras. [Samsung Display]
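To make the idea of spatial and colour-sampling compression concrete, here is a minimal Python toy: it halves the data in one row of pixel values by storing only the average of each neighbouring pair, roughly what chroma subsampling does to colour detail. It is only an illustration of throwing away redundancy; it is not how JPEG, MPEG or any real codec is implemented.

```python
# Toy illustration of lossy "spatial" compression: average each pair of
# neighbouring pixel values, store only the averages, then rebuild the row.
# The rebuilt row is close to the original but not identical: data was lost.

def compress_row(pixels):
    """Keep one average value per pair of pixels (half the data)."""
    return [(pixels[i] + pixels[i + 1]) // 2 for i in range(0, len(pixels), 2)]

def decompress_row(averages):
    """Rebuild the row by repeating each stored average twice."""
    rebuilt = []
    for value in averages:
        rebuilt.extend([value, value])
    return rebuilt

original = [52, 55, 61, 59, 79, 61, 76, 61]   # one row of grey levels
compressed = compress_row(original)            # [53, 60, 70, 68]: half the size
restored = decompress_row(compressed)          # [53, 53, 60, 60, 70, 70, 68, 68]

print("original :", original)
print("restored :", restored)
print("identical?", restored == original)      # False: information was thrown away
```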

Lossless Data Compression

Lossless means the compressed data can be decompressed back into the exact original: every single part of the data that was originally in the file remains exactly as it was after decompression.

PNG, GIF, ZIP, Motion JPEG 2000, VC-2 HQ, AV1, Apple Animation, FFV1 (FFmpeg’s lossless codec) and CorePNG are some examples of this type of compression. For more detail: https://en.wikipedia.org/wiki/Lossless_compression
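As a small demonstration of the “exact original” property, the sketch below uses Python’s built-in zlib module (the DEFLATE family behind ZIP and PNG) to compress some text and then verify that decompression returns every byte unchanged. The sample text and compression level are arbitrary choices.

```python
# Lossless compression in practice: zlib shrinks the data, and decompressing
# gives back every single byte of the original. A minimal sketch, not a full
# archiver.
import zlib

original = ("The compressed data can be decompressed back "
            "into the exact original. " * 50).encode("utf-8")

compressed = zlib.compress(original, level=9)
restored = zlib.decompress(compressed)

print("original size  :", len(original), "bytes")
print("compressed size:", len(compressed), "bytes")
print("exact copy?    :", restored == original)   # True: nothing was lost
```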

My Suggestions

For your photographs you can also use PSD, BMP or TIFF. With TIFF you can be sure that no data is lost in the compression.

For audio: WAV is typically uncompressed, but it takes up a lot of space.

Video: if you want no loss of image quality, HEVC (H.265), which offers a lossless mode, and MRP are the ones I can suggest.

Don’t Be Afraid to Lose Data

It looks frightening at first, thinking about how many compromises you need to make. Using lossy compression gives you more space in your data storage, but lowering the quality forces you to accept imperfection. Still, lossy compression for images can sometimes be a smart move: if the image looks just as good to the human eye, in other words if the visual quality is about the same, why should we care about the technical loss?

On the other hand, an image that is highly compressed from the start, especially in video, doesn’t always allow for the necessary touches afterwards: low-quality footage makes colour correction or contrast adjustments difficult later on. That’s why, for video, it is a good idea to shoot your rushes (footage) in high quality, so you keep as much detail as possible from the start. Once you finish editing and arrive at post-production, you can apply lossy compression; that is perfectly fine, and you will see it doesn’t make a very big difference.

For audio storage, however, I suggest the opposite: the higher the quality, the better. Making changes to a sound wave afterwards won’t give the same calibre; audio doesn’t tolerate compression as well as image files do, and once the quality is lost, it is gone. That’s why I think you should keep the audio files in your archive as lossless as possible.

Analog vs. Digital recording? Which one is superior?

Both digital and analog recording transform sound into an electrical signal, but analog uses a different method to replicate the original sound waves than digital does. Both try to stay as close to the original sound as possible. So which type of recording is superior, analog or digital? That’s a difficult question to answer.

In AUDIO

Which is the better way to represent the true essence of a sound: digital or analog?

Vinyl and cassette tapes (magnetic tape) are examples of analog recording media. Analog media record the sound as a physical imprint, a surface with ups and downs. That’s why vinyl has that tiny imperfection in its sound, the crackling and popping noise.

A digital system, on the other hand, produces cleaner audio by capturing the sound wave in a discrete way: the information arrives as blocks coded with 1s and 0s, and the needle-vibration noise is gone. However, because the sound is captured as a series of snapshots per second, the information is translated into bits and pieces.

In the binary 1-and-0 world of digital conversion, we lose some of the original sound wave. To put it another way, a digital system reproduces the recording more cleanly, but some listeners feel it misses the closeness to the original sound that analog delivers.
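A short sketch of that sampling-and-rounding process: the code below takes snapshots of a sine wave and rounds each one to a deliberately coarse 4-bit scale so the quantisation error is easy to see. The tone frequency, sample rate and bit depth are illustrative values, not a claim about any particular recorder.

```python
# A minimal sketch of what a digital recorder does to an analog wave:
# take snapshots (samples) at regular intervals and round each one to the
# nearest step the bit depth allows (quantisation). The rounded values are
# what gets stored as 1s and 0s; the tiny rounding differences are the
# information the digital copy loses.
import math

FREQ_HZ = 440.0        # an A4 tone
SAMPLE_RATE = 8000     # snapshots per second
BIT_DEPTH = 4          # deliberately coarse so the error is visible
LEVELS = 2 ** BIT_DEPTH

for n in range(5):                                   # first few samples only
    t = n / SAMPLE_RATE                              # time of this snapshot
    analog = math.sin(2 * math.pi * FREQ_HZ * t)     # "true" analog value, -1..1
    code = round((analog + 1) / 2 * (LEVELS - 1))    # nearest 4-bit level, 0..15
    digital = code / (LEVELS - 1) * 2 - 1            # what playback reconstructs
    print(f"t={t:.6f}s analog={analog:+.4f} stored code={code:2d} "
          f"playback={digital:+.4f} error={digital - analog:+.4f}")
```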

According to audiophiles, the purest sound, the true essence of the sound, comes from a vinyl record played on an analog system. With today’s technology, if you really want a high-quality sound system you need speakers, amplifiers, headsets, mixers and so on, and that will definitely cost you a fortune.

In VIDEO

Which image quality is better in cameras? Digital or Analog?

When it comes to visual image transmission, the analog vs. digital competition comes back into play.

The most serious disadvantage of analog signals for imagery, compared to sound, is that analog image transmission always contains noise, and when the image is copied or processed that noise is reproduced in every new generation. In a sensor, the charge from each pixel must first be converted to a voltage; this is done with a capacitor circuit. The voltage levels must then be measured and converted to a number, which is done with an analog-to-digital converter (A/D).
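A minimal sketch of that last measurement step, an analog-to-digital converter mapping a pixel voltage to an integer code. The 0 to 1 volt range and 8-bit resolution are assumptions for the illustration, not the specification of any real sensor.

```python
# Sketch of an A/D converter: map a pixel's voltage to one of 2^N integer codes.
FULL_SCALE_V = 1.0   # assumed voltage produced by a fully exposed pixel
BITS = 8             # assumed converter resolution
CODES = 2 ** BITS    # 256 possible output numbers

def adc(voltage):
    """Convert a pixel voltage (0..FULL_SCALE_V) to an integer code (0..255)."""
    clamped = min(max(voltage, 0.0), FULL_SCALE_V)
    return min(int(clamped / FULL_SCALE_V * CODES), CODES - 1)

for v in (0.0, 0.1234, 0.5, 0.999, 1.0):
    print(f"{v:.4f} V  ->  code {adc(v):3d}")
```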

A key advantage of a digital image over an analog image is the ability to make unlimited copies without any loss of image quality.

Both CCD (charge-coupled device) and CMOS (complementary metal-oxide semiconductor) camera sensors convert light into electrons.

Since the 1960s, at first with a single colour sensor, the CCD has been a major technology for digital imaging. CCD sensors are analog components that require more electronic circuitry outside the sensor; they are more expensive to produce and can consume up to 100 times more power than CMOS sensors. CCDs tend to be used in cameras that focus on high-quality images with lots of pixels and excellent light sensitivity.

Analog CCD vs. Digital CMOS ?

CMOS sensors, on the other hand, are made on standard silicon production lines. They are usually less expensive and easier on battery life, and a CMOS imager has a completely digital output. However, CMOS sensors traditionally had lower quality, lower resolution and lower sensitivity.

In a CCD, all the pixels are read out through a single output amplifier, which therefore needs a high bandwidth, and higher bandwidth means higher noise. Consequently, high-speed CMOS imagers can be designed to have much lower noise than high-speed CCDs.

What about Tri-CCD vs Tri-CMOS ?

Today the technology has advanced: cameras with only one colour-sensing chip are increasingly replaced by cameras with three. But what does that mean?

Cameras generally deliver image quality according to how many optical colour separations they use. Today’s three-chip cameras dedicate one sensor to each of the main colours: the red, green and blue bands.

Conclusion

As a result, using digital cameras is more a necessity than a choice in terms of what they can provide: more advanced features (test image, stamp, frame counter, I/O port status, error checking, partial scan, image flip and so on) and higher resolutions are simply not possible with analog cameras. Using analog sound-recording technology, however, is still a personal choice rather than a necessity.

Hubble sees the brightest quasar in the early Universe

Quasars are the extremely bright nuclei of active galaxies. The powerful glow of a quasar is created by a supermassive black hole which is surrounded by an accretion disc. Gas falling toward the black hole releases incredible amounts of energy, which can be observed over all wavelengths.

The newly discovered quasar, catalogued as J043947.08+163415.7, is no exception to this; its brightness is equivalent to about 600 trillion Suns and the supermassive black hole powering it is several hundred million times as massive as our Sun. “That’s something we have been looking for for a long time,” said lead author Xiaohui Fan (University of Arizona, USA). “We don’t expect to find many quasars brighter than that in the whole observable Universe!”


This image shows the distant quasar J043947.08+163415.7 as it was observed with the NASA/ESA Hubble Space Telescope. The quasar is one of the brightest objects in the early Universe. However, due to its distance it only became visible as its image was made brighter and larger by gravitational lensing. The system of the lensed images and the actual lens is so compact that Hubble is the only optical telescope able to resolve it.

[NASA, ESA, X. Fan (University of Arizona)]

Despite its brightness, Hubble was able to spot the quasar only because its appearance was strongly affected by gravitational lensing. A dim galaxy is located right between the quasar and Earth, bending the light from the quasar and making it appear three times as large and 50 times as bright as it would be without the effect of gravitational lensing. Even so, the lens and the lensed quasar are extremely compact and unresolved in images from optical ground-based telescopes. Only Hubble’s sharp vision allowed it to resolve the system.

The data show not only that the supermassive black hole is accreting matter at an extremely high rate but also that the quasar may be producing up to 10 000 stars per year. “Its properties and its distance make it a prime candidate to investigate the evolution of distant quasars and the role supermassive black holes in their centres had on star formation,” explains co-author Fabian Walter (Max Planck Institute for Astronomy, Germany), illustrating why this discovery is so important.

Quasars similar to J043947.08+163415.7 existed during the period of reionisation of the young Universe, when radiation from young galaxies and quasars reheated the obscuring hydrogen that had cooled off just 400 000 years after the Big Bang; the Universe reverted from being neutral to once again being an ionised plasma. However, it is still not known for certain which objects provided the reionising photons. Energetic objects such as this newly discovered quasar could help to solve this mystery.

For that reason the team is gathering as much data on J043947.08+163415.7 as possible. Currently they are analysing a detailed 20-hour spectrum from the European Southern Observatory’s Very Large Telescope, which will allow them to identify the chemical composition and temperatures of intergalactic gas in the early Universe. The team is also using the Atacama Large Millimeter/submillimeter Array, and hopes to also observe the quasar with the upcoming NASA/ESA/CSA James Webb Space Telescope. With these telescopes they will be able to look in the vicinity of the supermassive black hole and directly measure the influence of its gravity on the surrounding gas and star formation.

Apollo 8 Astronaut: Sending People to Mars Would Be Stupid

Bill Anders, famous for “Earthrise”, the first photograph of Earth taken by a human from the orbit of the Moon, has criticised Elon Musk and Jeff Bezos over their highly ambitious space exploration projects. Anders also believes NASA is not ready to go back to the Moon and that there is not enough public support for it.

Bill Anders, who reached lunar orbit 50 years ago and took the photograph of Earth rising over the Moon’s horizon, has levelled serious criticism at the space exploration projects of private aerospace firms as well as NASA.

Speaking to BBC Radio 5 Live for a new documentary about the Apollo 8 mission, Anders said that space travel is not actually fun and that going to the Moon is not so exciting. While talking about how hard space travel is, Anders argued that NASA should be spending its budget on other things instead of planning a manned mission to Mars.

“What’s the imperative? What’s pushing us to go to Mars? I don’t think the public is that interested,” said Anders, adding that the public isn’t even that interested in going back to the Moon, which is true. According to a 2018 Pew poll, only 18 per cent of US citizens think going to Mars is important, while 45 per cent say it is a lower priority. 37 per cent of Americans, on the other hand, think it is unnecessary to go to the Red Planet at all, the very goal Elon Musk is committed to making real.

You might think that, with Mars bombarded by the Sun’s radiation and so far away, the public’s opinion is not surprising. But Americans’ general view of going back to the Moon is almost the same as for Mars: only 13 per cent say it is imperative, while 42 per cent think it is a lower priority and 44 per cent say it is not important.

Earthrise. [NASA]

“Mars colony is nonsense”

What Anders underlined about space exploration is important because many people support the idea of “extending the lifespan of Earth” instead of colonising the Moon or Mars. Some argue that space agencies should spend their budgets on helping the environment rather than on building spacecraft to explore deep space. Some scientists, like NASA’s Umut Yıldız, think the budget of the weapons industry should be downsized and allocated to environmental projects instead.

Anders supports unmanned Mars missions in this context but thinks sending people there is just a fantasy, and the plans of Elon Musk’s SpaceX and Jeff Bezos’ Blue Origin are hard for him to believe. “There’s a lot of hype about Mars that is nonsense,” said Anders. “Musk and Bezos, they’re talking about putting colonies on Mars, that’s nonsense,” he says.

Illustration of SpaceX’s Starship at a Mars colony. [SpaceX]

Frank Borman, commander of the Apollo 8 mission, backed Anders on his view of deep-space missions. According to him, you would just puke all the way there only to see “devastation”, “meteor craters” and “no color at all”, just different shades of grey.

Recalling that the Apollo missions took place in an era of extreme competition with the Soviets, Anders believes NASA couldn’t go to the Moon today. “They’re so ossified… many of the centres are mainly interested in keeping busy and you don’t see the public support.”

Despite his own criticisms, Borman remained supportive of NASA, saying “we need robust exploration of the Solar System and man is a part of that.”

It is hard to know what Elon Musk makes of Anders’ comments, or whether NASA will re-evaluate its robotic missions to the Moon because of them. But space exploration is already a huge industry, and it cannot be held back while there are so many ambitious entrepreneurs like Musk, Bezos and Virgin Galactic founder Richard Branson.

Sources

IFLS, Sciencealert, DijitalX

A Little Step Toward Tesla’s Dream of Free Energy: Plancx

It is becoming evident that in today’s technology-driven world we need electricity quicker, cheaper and easier every day; electricity is the lifeblood of modern society. That is why, at the beginning of this year, we took up production of the Plancx solar charger, designed to meet the power needs of the devices we use to improve our business and social relationships, or simply to have some fun.

In this respect, during the research and development phase we tried to find the best solar panel on the market and the lamination that makes a panel more durable and efficient; these two main requirements also had to be met under the cheapest and highest-quality production conditions.

As a result of that research, we decided to go into production with the most productive and serviceable panel, built on Maxeon™ Gen III solar cells produced by SunPower, a corporation of the United States, and laminated with ethylene tetrafluoroethylene (ETFE), which helps increase the panel’s efficiency and durability.

In the meantime, using these two best-in-class components, we worked toward a remarkable design that makes the charger more portable and easy to have with us at every moment of our lives.

Finally, three Plancx solar chargers have been designed for different purposes of use. These are:

Plancx City can generate 7 watts / 1.4 amperes and is designed to be easy to carry thanks to its small size in city life, especially in women’s bags.

Plancx Walk can generate 7 watts / 1.4 amperes and is designed as a mono block for outdoor activities such as camping and the beach, with a bag on its back to hold any device.

Plancx Road can generate 12 watts / 2.4 amperes and is designed for quicker charging. Real usage tests performed by us can be seen in the attached video.

These power figures, given by the producer, are the result of technical tests. Actual amperes and watts may vary according to the amount of sunlight received.

https://youtu.be/Zbea4CRHe_Q

The world’s most handy solar panels are launching on Indiegogo on 1 February 2019.

FACTS

City: Weight: 0.150 kg (0.33 lbs)

Folded: 16.5 cm x 15.5 cm x 0.1 cm (6.5 in x 6.1 in x 0.04 in)

Unfolded: 16.5 cm x 31.5 cm x 0.3 cm (6.5 in x 12.4 in x 0.12 in)

Power capacity: 7 watts / 5 volts / 1.4 amperes

Walk: Weight: 0.217 kg (0.48 lbs)

Dimensions: 17 cm x 31 cm x 0.5 cm (6.7 in x 12.2 in x 0.2 in)

Power capacity: 7 watts / 5 volts / 1.4 amperes

Road: Weight: 0.218 kg (0.48 lbs)

Folded: 25 cm x 16 cm x 0.5 cm (9.8 in x 6.3 in x 0.2 in)

Unfolded: 25 cm x 33 cm x 0.1 cm (9.8 in x 13 in x 0.04 in)

Power capacity: 12 watts / 5 volts / 2.4 amperes
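As a quick sanity check on these figures, using the standard relation between power, voltage and current (my own arithmetic, not data from the manufacturer’s sheet):

\[
P = V \times I: \qquad 5\ \text{V} \times 1.4\ \text{A} = 7\ \text{W (City, Walk)}, \qquad 5\ \text{V} \times 2.4\ \text{A} = 12\ \text{W (Road)}.
\]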

The product has a lifespan of more than 5 years of full-time use, as long as it is not broken or exposed to intense heat. Thanks to its break-resistant construction it can even keep generating electricity when cracked, as long as it has not split completely. It is also the most flexible, waterproof and dustproof panel on the market. These capabilities make it one of the truly sustainable, eco-friendly energy sources. It also has an auto-restart capability: if charging stops because a cloud blocks the sunlight, it starts again automatically as soon as any sunlight returns.

There is a single USB Type-A port, a common connector compatible with almost all devices such as smartphones, tablets, Kindles, speakers, smartwatches and power banks, and there are several ways to use the panel: sticking it to non-porous surfaces with its suction cups, hanging it with carabiners or standing it upright. It is really easy to use: just plug the cable into the charger and turn it toward the sun.

Besides, it has no battery, so you don’t have to worry while using it under the sun. Power banks provide a portable way to recharge devices, but once they run out of power they become useless until recharged; our panel only needs sunlight, so you don’t have to wait for it to recharge before using it again. Under full sun it charges like a wall charger; under a cloudy sky, more like a USB port on a computer.

The price will be $24 for City, $22 for Walk and $29 for Road with the 25% Indiegogo early-bird discount.

Communicate Better with your Audio Expert in 5 Minutes!

Welcome to the audio-visual geeks’ corner.

Today we will learn how to communicate better with the sound experts in the audio-visual sector. Are you a videographer, podcaster or film director? Then you need to express your basic needs to your audio professionals more clearly. You don’t need long to learn the basics.

Audio Vocabulary 

Technical professions such as acoustic engineering, sound engineering, audio engineering, mixing, sound design, audio equipment testing and music have their own way of describing the forms of a sound to one another. Communication in today’s world is not limited to person-to-person exchanges; it also has to work across your own production sector.

Let’s learn how some basic vocabulary works in the sound expert’s world:

What is a Sound?

Sound is a vibration in the air, in other words an oscillating wave. To describe sound travelling through the air, French uses the term onde (wave); these waves provoke a sensation in our ears.

  1. The denser and more rigid the medium, the faster sound travels: roughly 340 metres per second in air, about 1,400 m/s in water and around 5,000 m/s in metal (a quick worked example follows after this list).
  2. A sound has 4 main elements. Sound experts generally describe a distinct sound by its pitch (in French the term frequency is more common), its intensity (loudness), its timbre and how it develops over time. Pitch is a perception of a sound and can only be judged subjectively, whereas frequency has a scientific definition.
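A quick worked example using the speeds from the first point: the time sound needs to cross one kilometre is distance divided by speed.

\[
t = \frac{d}{v}: \qquad \frac{1000\ \text{m}}{340\ \text{m/s}} \approx 2.9\ \text{s in air}, \qquad \frac{1000}{1400} \approx 0.7\ \text{s in water}, \qquad \frac{1000}{5000} = 0.2\ \text{s in metal}.
\]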

What is a Frequency?

Regardless of the object, vibration creates a sound wave. The back-and-forth motion of the particles can produce either a narrow band of frequencies or a wide band of frequencies. The number of cycles that a vibrating object completes in one second is called its frequency; another word for a cycle is a vibration.

  1. The most commonly used unit of frequency is the hertz (abbreviated Hz): 1 vibration per second is 1 Hz. People can hear sounds at frequencies from about 20 Hz to 20,000 Hz, the maximum hearing range for humans; dogs, for example, can hear from approximately 40 Hz up to 60,000 Hz. (A small tone-generation sketch follows after this list.)

Sinusoid of a sound wave. [The Physics Classroom]

 

2. Sounds above the upper audible limit of human hearing, 20,000 Hz, are called ultrasounds; sounds below 20 Hz are called infrasounds. The higher the frequency, the more directional the sound: trebles travel in a narrow direction, while basses (in French grave, médium and aigu for bass, mid and treble) spread out in every direction.

That’s why, when neighbours complain about noise, they complain about the bass: the propagation of trebles can be blocked, but basses are much harder to stop.
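Here is the tone-generation sketch promised above: it writes a two-second 440 Hz sine tone as a 16-bit mono WAV file using only Python’s standard library, so you can hear how pitch follows the number of cycles per second. The file name, duration and level are arbitrary choices for the demo.

```python
# Write a pure sine tone to a WAV file: frequency = cycles per second.
import math
import struct
import wave

FREQ_HZ = 440          # cycles per second, the pitch A4
SAMPLE_RATE = 44100    # samples per second
DURATION_S = 2
AMPLITUDE = 0.5        # half of full scale, to leave headroom

with wave.open("tone_440hz.wav", "wb") as wav:
    wav.setnchannels(1)                 # mono
    wav.setsampwidth(2)                 # 2 bytes = 16-bit samples
    wav.setframerate(SAMPLE_RATE)
    frames = bytearray()
    for n in range(SAMPLE_RATE * DURATION_S):
        sample = AMPLITUDE * math.sin(2 * math.pi * FREQ_HZ * n / SAMPLE_RATE)
        frames += struct.pack("<h", int(sample * 32767))
    wav.writeframes(bytes(frames))

print("Wrote tone_440hz.wav: change FREQ_HZ to hear how pitch follows frequency.")
```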

What is timbre?

Every instrument has its own voice (tone). The timbre of an instrument is made up of its unique pattern of vibrations, and this uniqueness gives each instrument a personality. Thanks to timbre we can tell different tone qualities apart.

What is Intensity?

Intensity is how forcefully we play the instrument, and it is perceived as loudness. The loudness of a sound is measured in decibels (dB).

There is no place of total silence on Earth.

Just to give you an idea, a desert sits around 0-10 dB, a sound studio around 10-20 dB and a rocket launch around 180 dB. Doubling a sound’s power corresponds to an increase of 3 dB.

  1. Sound intensity level, also known as acoustic intensity level, is defined from the power carried by sound waves per unit area in a direction perpendicular to that area. Sound intensity is power per square metre, and the common unit of power is the watt (see the formula after this list).
  2. A sound-level meter is the device used for measuring the intensity of noise, music and other sounds.
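Putting the watts-per-square-metre definition and the “3 dB per doubling” rule together, with the standard reference intensity \(I_0 = 10^{-12}\ \text{W/m}^2\) used in acoustics (a textbook formula, not something specific to this article):

\[
L = 10 \log_{10}\!\left(\frac{I}{I_0}\right)\ \text{dB}, \qquad 10 \log_{10}\!\left(\frac{2I}{I_0}\right) - 10 \log_{10}\!\left(\frac{I}{I_0}\right) = 10 \log_{10} 2 \approx 3\ \text{dB}.
\]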

Why does time matter?

The length or duration of a piece depends on time. Especially if you need specific music or a particular harmony in your orchestra, it is vital to know when to end a sound. The unit of time in the audio-visual world is based on the second.

  1. In audio terms, the beginning of a sound is called the attack: the time between the start of a sound and the moment it reaches its maximum level.

To become a great réalisateur (director), these basic concepts are your time-saving helpers. Better communication comes from learning to use this vocabulary: the more you use it, the more of a pro you become.

Dancing with the Enemy: Stellar duo in R Aquarii

This spectacular image — the second instalment in ESO’s R Aquarii Week — shows intimate details of the dramatic stellar duo making up the binary star R Aquarii. Though most binary stars are bound in a graceful waltz by gravity, the relationship between the stars of R Aquarii is far less serene. Despite its diminutive size, the smaller of the two stars in this pair is steadily stripping material from its dying companion — a red giant.

Years of observation have uncovered the peculiar story behind the binary star R Aquarii, visible at the heart of this image. The larger of the two stars, the red giant, is a type of star known as a Mira variable. At the end of their life, these stars start to pulsate, becoming 1000 times as bright as the Sun as their outer envelopes expand and are cast into the interstellar void.

The death throes of this vast star are already dramatic, but the influence of the companion white dwarf star transforms this intriguing astronomical situation into a sinister cosmic spectacle. The white dwarf — which is smaller, denser and much hotter than the red giant — is flaying material from the outer layers of its larger companion. The jets of stellar material cast off by this dying giant and white dwarf pair can be seen here spewing outwards from R Aquarii.

Occasionally, enough material collects on the surface of the white dwarf to trigger a thermonuclear nova explosion, a titanic event which throws a vast amount of material into space. The remnants of past nova events can be seen in the tenuous nebula of gas radiating from R Aquarii in this image.

R Aquarii lies only 650 light-years from Earth — a near neighbour in astronomical terms — and is one of the closest symbiotic binary stars to Earth. As such, this intriguing binary has received particular attention from astronomers for decades. Capturing an image of the myriad features of R Aquarii was a perfect way for astronomers to test the capabilities of the Zurich IMaging POLarimeter (ZIMPOL), a component on board the planet-hunting instrument SPHERE. The results exceeded observations from space — the image shown here is even sharper than observations from the famous NASA/ESA Hubble Space Telescope.

SPHERE was developed over years of studies and construction to focus on one of the most challenging and exciting areas of astronomy: the search for exoplanets. By using a state-of-the-art adaptive optics system and specialised instruments such as ZIMPOL, SPHERE can achieve the challenging feat of directly imaging exoplanets. However, SPHERE’s capabilities are not limited to hunting for elusive exoplanets. The instrument can also be used to study a variety of astronomical sources — as can be seen from this spellbinding image of the stellar peculiarities of R Aquarii.