Welcome to City-Data.com Forum!
Old 04-08-2018, 07:20 AM
 
When I was a kid in the late 70s, I remember I would always take mental note of a particular movie that I liked and its film quality: Paramount, Warner Bros, MGM, Universal, etc. For example, there were major films I liked, such as "Grease," that looked like they had a grainier quality. The same with some of Spielberg's movies. I'm talking about major films. But another lesser movie from the same period or earlier, even possibly a lower-budget film, would look cleaner and clearer.

I remember liking Paramount films, but they may have been grainier, and so I would even associate that graininess with some unknown higher quality that I didn't understand. TV shows would often have this cleaner look. (Filmed drama shows, rather than live and videotaped ones.)

Does anyone know what would have caused these differences?

 
Old 04-08-2018, 08:37 AM
 
I never thought about it from studio to studio, but even as a kid I noticed the difference in film quality between movies from the 1960s and 1970s and then the 1980s onward. I couldn't have explained why the movies looked different, but I noticed that they did.
 
Old 04-10-2018, 05:26 PM
 
Sure. All studios have studio heads, and they may have personal taste that favors particular kinds of movies, or particular looks. More often, though, I think that these days it's more due to the taste of the particular director. In the big-studio era, studios had particular niches, such as gritty crime films or glossy romances. Also, over time, as Mark says, sometimes different kinds of film were developed.

Funny, I use the same word, "clean."
 
Old 04-10-2018, 05:53 PM
 
I don't really notice the differences anymore. If I see a new movie on TV I can't really tell the difference between it and a filmed TV show. Now I'm beginning to wonder if it was all just my imagination when I was a kid. "Grease" and "Jaws" today look just as clean as any TV show. Maybe my mind was just imagining a difference due to the different nature of the storytelling, and viewing them as very different things in that sense. I remember wondering if there was a difference between Battlestar Galactica, the original series, and the movie of the same name around 1979. I couldn't really tell. And I'm sure that I used to know what film studio created them.
 
Old 04-10-2018, 06:41 PM
 
One thing I noticed as a kid in the 90s is how Universal movies had a softer look to them compared to Columbia movies; I'm not sure why.
 
Old 04-10-2018, 08:14 PM
 
Quote:
Originally Posted by ironpony View Post
One thing I noticed as a kid in the 90s is how Universal movies had a softer look to them compared to Columbia movies; I'm not sure why.
Interesting. It could have been psychological. Or maybe it was real, like something about the way they produced movies.
 
Old 04-10-2018, 10:12 PM
 
Maybe it's psychological. But when digital color grading came in and I saw The Mummy back in 1999, the difference went away, as movies had adopted completely new looks.
 
Old 04-11-2018, 08:15 PM
 
This gets technical real quick, so I'll try to just give an overview.

"When I was a kid in the late 70s, I remember I would always take mental note of a particular movie that I liked and its film quality: Paramount, Warner Bros, MGM, Universal, etc. For example, there were major films I liked, such as "Grease," that looked like they had a grainier quality. The same with some of Spielberg's movies. I'm talking about major films. But another lesser movie from the same period or earlier, even possibly a lower-budget film, would look cleaner and clearer.

I remember liking Paramount films, but they may have been grainier, and so I would even associate that graininess with some unknown higher quality that I didn't understand. TV shows would often have this cleaner look. (Filmed drama shows, rather than live and videotaped ones.)"


Start with TV. Television shows had great studio lighting. When shows were recorded in that era, it was on either 2" or (less likely) 1" videotape, 35mm film, or, rarely, 16mm film. 2" tape had about the best quality a television set could recreate, and it was "smooth" to look at within the limitations of the imaging circuitry. 16mm was used primarily for local programming. Network shows were generally recorded on 35mm, especially programming that might have residual value.

When you have great, consistent studio lighting, a controlled environment, and a dedicated (union) crew, the result can be amazing. It is a craft, and these are craftspeople; the output was optimized for gamma, color balance, and viewing on the sets of the time, as well as being clean enough for later syndicated re-release.

If you want a comparison of film vs. video, look at the Monty Python TV show. Although it is PAL and not NTSC, indoor scenes were videotaped and outdoor scenes were shot on 35mm, by contract with the unions in the U.K.

Next, there is the issue of film speed and grain. Films capable of recording images in low-light conditions have a larger grain structure. A candlelight scene without auxiliary lighting requires an extremely high-speed film. If a film has a number of scenes in low light, the production might choose a faster stock for the entire film, for the sake of continuity of image quality. That means many intense personal dramas, with high artistic values and adult themes, might have a larger grain structure. Films with more mainstream appeal would try to avoid the larger grain, as it could be viewed as an affectation.
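A back-of-the-envelope calculation shows why an unassisted candlelight scene pushes a production toward fast, grainy stock. The numbers below are my own illustrative assumptions, not figures from this post: candlelight at roughly EV 1 (referenced to ISO 100), a fast f/1.4 lens, and a 180-degree shutter at 24 fps (about 1/48 s of exposure per frame).

```python
def required_iso(ev_at_iso100, f_number, exposure_time):
    """Solve the exposure equation 2**EV = N**2 / (t * S/100) for the
    film speed S needed to expose a scene of the given brightness."""
    return 100 * f_number**2 / (2**ev_at_iso100 * exposure_time)

# Assumed values: candlelight ~ EV 1, f/1.4 lens, 1/48 s per frame.
iso = required_iso(ev_at_iso100=1, f_number=1.4, exposure_time=1/48)
print(round(iso))  # roughly 4700
```

Even under those generous assumptions the scene calls for film on the order of ISO 4700, while typical color negative of the era was far slower, which is why such scenes meant auxiliary lighting, exotic lenses, or the fastest (grainiest) stock available.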

Then there is the physical size of the film. A few films were shot on 16mm and blown up to 35mm. There was an intermediate size, Super 16, developed and promoted by someone I knew in upstate NY. Then there was the standard 35mm frame, which had any number of variations. After that were 65mm and 70mm. Cinerama had died out by then, but Todd-AO was still around.
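To put those gauges in rough numbers, here is a quick sketch comparing nominal frame areas. The dimensions are approximate figures I'm supplying for illustration, not the poster's; the point is that more negative area means less enlargement to fill the screen, and therefore less visible grain.

```python
# Approximate camera-frame dimensions in mm (nominal values, from memory).
frames = {
    "16mm":          (10.26, 7.49),
    "Super 16":      (12.52, 7.41),
    "35mm Academy":  (21.95, 16.00),
    "65mm (5-perf)": (52.48, 23.01),
}

base = frames["35mm Academy"][0] * frames["35mm Academy"][1]
for name, (w, h) in frames.items():
    area = w * h
    # 65mm has roughly 3.4x the area of 35mm; 16mm only about a fifth.
    print(f"{name:13} {area:7.1f} mm^2  ({area / base:.2f}x 35mm)")
```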

Next (avoiding the whole three-strip Technicolor process) were the labs and how they handled things. The original camera negative was generally edited, then duplicated, and contact prints were made from the high-quality duplicates. Most of the work was done with the Kodak process, but Fuji was breaking in at the time.

The film stock changed in the 1970s from acetate to polyester, which had some effect, but overall the Eastmancolor emulsion was used. Release prints that went out to the theatres were of varying quality. If the prints were rushed, there could be a variety of problems, such as lack of lubrication (which required putting wax or oil on the edges of the film) or even outright bad prints. I got one reel of the original "Star Wars" with the stock reversed, so the soundtrack printed onto the image area and the image into the mud of what was normally the soundtrack area. Eastmancolor prints "went pink" quickly as they aged. A circuit print that was a few years old, or had been stored in heat, might have no greens or blues. Films like "Pippi Longstocking," "Song of Norway," and "Santa vs. the Martians" could be painful to watch at first, until the brain started to compensate.

Once the film got to a theatre, there were other areas of difference. Some lenses were, to put it politely, sh*t. Others were crystal sharp. Some projectors had judder and some were stable. Films that were not meant for mass market were shown in smaller auditoriums that gave a brighter picture and often had smaller aperture lenses that were sharper, showing more of the grain structure.

If you ever get a chance to see movies as they were meant to be, jump to do so. The original films were small grain on nitrocellulose (gun cotton) film and are simply gorgeous to see when projected with a projector using a carbon arc lamphouse. I had been in the industry a few years before I got a chance to show a film like that and I was stunned. As the saying goes, "You ain't seen nuthn' yet."
 
Old 04-12-2018, 02:52 AM
 
Quote:
Originally Posted by harry chickpea View Post
This gets technical real quick, so I'll try to just give an overview.

...
Thanks. For some reason I always pictured some shows that were done on video as being well lighted.

I used to be able to tell the difference between a soap opera (video) and a movie (film). I'm wondering if soaps are also shot digitally today, like movies, and if there is still that difference. Is it all the same today? It seems like there is a difference, but I can't tell if it's my imagination.

I also used to wonder why movies were not videotaped. To me, a live broadcast always looked more "clear" than a movie. So wouldn't the filmmakers want their films to look as clear as possible?
 
Old 04-12-2018, 03:26 PM
 
Quote:
Originally Posted by OzzyRules View Post
Thanks. For some reason I always pictured some shows that were done on video as being well lighted.

They were. For archival purposes kinescope was used. https://en.wikipedia.org/wiki/Kinescope

I used to be able to tell the difference between a soap-opera (video) and a movie (film). I'm wondering if they are also being recorded in digital today like movies are, and if there is still that difference. Is it all the same today? It seems like there is a difference but I can't tell if it's my imagination.

Soap opera lighting is different. In a movie, a close-up shot is a separate set-up, and selective hair lights, key lights, fill lights, and effects are used because there is the opportunity to do it "right." In a soap opera, the general lighting is flatter and more forgiving, and having an actor hit a precise mark for a close-up is difficult, so the camera grabs the live action as it happens. Almost everything is digitally recorded at this point.

I also used to wonder why movies were not videotaped. To me, a live broadcast always looked more "clear" than a movie. So wouldn't the filmmakers want their films to look as clear as possible?
Videotape is a sh*t medium. The maximum theoretical resolution is only 525 lines, there are interlacing issues, and the gamma range and resolution are NOTHING compared to film. Video looked "clear" on TV sets because it was a complete end-to-end system where every detail of reproduction had been thought out and flaws compensated for. Film shown on TV had to go through a strange pull-down arrangement: film is shot at 24 frames per second, television is a nominal 30 frames per second, so until more modern and intelligent systems were devised to compensate, film always had a "look" on television. A DVD or Blu-ray of a film projected at 1080p is simply awesome compared with what had to be accepted on the old CRT televisions.
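That pull-down arrangement is the NTSC 3:2 pulldown: each 24 fps film frame is held for alternately three and two interlaced video fields, so four film frames fill exactly ten of the ~60 fields per second. A minimal sketch of the field pattern (my own illustration, not anything specific to this post):

```python
def pulldown_32(film_frames):
    """Map a sequence of film frames to NTSC video fields via 3:2 pulldown."""
    fields = []
    for i, frame in enumerate(film_frames):
        hold = 3 if i % 2 == 0 else 2  # alternate: 3 fields, then 2 fields
        fields.extend([frame] * hold)
    return fields

# Four film frames become ten fields: 24 fps * (10/4) = 60 fields per second.
print(pulldown_32(list("ABCD")))  # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

That uneven 3-then-2 cadence, where some frames linger a field longer than others, is a big part of the telltale "look" film had on television.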

Filmmakers have to conform to standards if they want their films played. There have been fantastic variations in process, such as 30 fps film, Showscan, IMAX, and others, but most theatres only had equipment for standard 35mm film shown at 24 fps. Nothing was ever going to substantially change in either television or motion-picture technology until digital and the complete revamp.
