New computer - recommendations?

Nick Leech

Well-known member
United Kingdom
My old PC is long in the tooth. Does anyone have recommendations for a new computer for photographic use - cataloguing and post-processing? (Plus it will be used for birding apps!)

Will be using DPP4 and Lightroom etc. Want a machine that doesn't take forever to open files, process images etc!

Never used a Mac, so will probably stick to a Windows PC.

I presume I will get a faster, more powerful desktop than laptop for a given budget?

- what level of processor?
- how much RAM?
- separate graphics card - which one, how much on-board memory?

Any specific recommendations for models of PC?

Decent performance, without breaking the bank!

Thanks!
 
Spec-wise

Processor: i5 or i7
RAM: at least 8 GB, better with 12 or 16
Definitely a separate GPU with 1 or 2 GB of memory (you don't need more unless you will be doing gaming or video editing)

Desktops are cheaper for a given power than laptops, plus if you are doing photo editing you will want a larger screen than a typical laptop.

Manufacturer wise I think there is little to differentiate them. Using a local independent is good as you can physically take it back if it goes wrong, but there is so much choice that these decisions are usually budget led.

Most of the personal computer market has moved on to laptops, so the desktop market is largely split between office machines with medium oomph but low storage and gaming machines with extreme oomph.

If you can put up with the funky cases and flashing lights, a low-spec gaming machine makes a good photo-editing machine. Or you can get an office machine but spec greater storage.
 
Mac if you can afford it, I think most pros and serious amateurs use them?


A

My feeling is that the difference in capability has more or less disappeared. At this point, it is more a question of which way your brain works - Windows or Mac. I have used Macs in the past but never could get my brain to work that way.

Niels
 
I would get a solid-state drive as your main boot disc (carrying the operating system and your software) for faster performance. The cheapest versions are SATA drives and are connected just like a normal hard drive - you only need about 250 GB for this purpose, coupled with a normal hard drive for storage. You get even faster data transfer with a PCIe SSD (I don't pretend to understand all this, except that a PCIe SSD plugs straight into a PCIe socket on the motherboard, while a SATA SSD is connected like any other hard drive).
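To put rough numbers on that boot-SSD-plus-data-drive split, here is an illustrative sketch. Every size below is an assumption chosen for the arithmetic, not a measurement - your OS, apps and raw file sizes will differ:

```python
# Illustrative estimate of why a ~250 GB boot SSD is enough when paired
# with a larger data drive. All figures are assumptions, not measurements.

boot_drive_gb = {
    "Windows": 40,
    "Lightroom + DPP4 + other apps": 30,
    "Lightroom catalog + previews": 50,
    "free space for updates/scratch": 60,
}

boot_used = sum(boot_drive_gb.values())
print(f"Boot SSD usage: ~{boot_used} GB of 250 GB")  # comfortably fits

# Raw files live on the spinning data drive. Assume ~30 MB per raw file
# and a fairly heavy 20,000 shots per year.
raws_per_year = 20_000
gb_per_year = raws_per_year * 30 / 1000
print(f"~{gb_per_year:.0f} GB of raws per year -> a 2-4 TB data drive lasts years")
```

The point of the split: the small, fast SSD holds only things that benefit from speed (OS, applications, catalog), while the bulk image archive sits on cheap spinning storage.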


It's quite difficult to buy off the shelf with the sort of spec you probably need but there are plenty of opportunities for specifying exactly what you want with various suppliers. And I agree with Mono, if you can find someone close to you that's an advantage.


Bill
 
Hi Nick,
Do you plan to build your own or buy one off the shelf? It is easy to assemble your own and you'll know where the bits came from, but there are no cost savings, just learning. Apart from that, your budget really determines the tradeoffs.
By all accounts, Lightroom is a resource hog, so you want a higher-end processor and plenty of memory. Memory is cheap; 32 GB is a reasonably future-proof number.
AMD now offers processors very competitive with Intel's, but you'll need an AMD-compatible motherboard - not quite the same as Intel's, even though both take the same memory, video cards and storage.
Storage should definitely begin with a solid-state drive; the Crucial (Micron) MX500, 500 GB for about $200, is a good choice if the default Samsung option is not to your taste. Add a few-terabyte regular drive for bulk storage - again cheap.
Video is an issue: Nvidia currently rules, but AMD Radeon cards work well with the AMD processor line. Note that the supply of higher-end cards is currently tight because of diversion to Bitcoin mining.

Frankly, the experience of building your own is well worth it, but it will suck up time that you would perhaps prefer to spend on photography or birding. Good luck and please keep us posted.
 
Sorry about the duplicate threads. I wasn't sure which was the more appropriate place to put the thread - this seemed a more general location (but do fewer people view it?), and the other location is well-read, but is maybe more software-orientated (Lightroom/Photoshop etc.)?

My apologies for any confusion caused!
 
Hi Nick, no problem, I see the new posts in all forums so was seeing both of your threads. Good luck with your computer search.

Dave
 
Are you also planning on purchasing a new monitor?

If you're looking at 4K video editing/viewing, your monitor or TV is going to need HDMI 2.0 ports or DisplayPorts, both of which are found on 4K video cards. However, just because a card has both ports doesn't mean the output of each port is 4K video, just as the input ports on a monitor don't automatically have the ability to receive 4K video.

Anything less than HDMI 2.0 or DisplayPort on the monitor/TV won't give you the data transfer needed for the 4K experience, even though you may have a 4K video card in your computer. HDMI 2.0/DisplayPort provides the 18 Gbps needed for the deep colors and refresh rate of the video. More color = more bits = higher data rate = more bandwidth required.

If you have the earlier HDMI 1.4 ports on your monitor/TV, your transfer rate/bandwidth is 10.2 Gbps.

Without the ability to transfer more data, your on-screen visuals (resolution) will basically be the same as they are now without a 4K video card. Simply put, your components have to match: monitor/TV to video card.

As for the refresh rate, you're probably experiencing 30 fps at 60 Hz if your method of transfer is HDMI 1.4. I believe this will stay the same if your components don't match; otherwise you can get 60 fps or higher, but how much higher I don't know.
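Those bandwidth figures can be sanity-checked with quick arithmetic. The sketch below is illustrative only: it computes the raw pixel data rate for 4K at 8-bit colour and compares it against the HDMI link rates after 8b/10b encoding overhead, ignoring blanking intervals (which push real requirements somewhat higher):

```python
# Rough, illustrative comparison of uncompressed 4K video bandwidth
# against HDMI 1.4 and 2.0 link capacity. Simplified: ignores blanking.

def video_bandwidth_gbps(width, height, fps, bits_per_pixel=24):
    """Raw pixel data rate in gigabits per second (no blanking/overhead)."""
    return width * height * fps * bits_per_pixel / 1e9

# Raw HDMI link rates, minus ~20% lost to 8b/10b encoding
hdmi_1_4_effective = 10.2 * 0.8   # ~8.16 Gbps usable
hdmi_2_0_effective = 18.0 * 0.8   # ~14.4 Gbps usable

uhd60 = video_bandwidth_gbps(3840, 2160, 60)   # 4K at 60 fps
uhd30 = video_bandwidth_gbps(3840, 2160, 30)   # 4K at 30 fps

print(f"4K60 needs ~{uhd60:.1f} Gbps; HDMI 1.4 carries only ~{hdmi_1_4_effective:.1f} Gbps")
print(f"4K30 needs ~{uhd30:.1f} Gbps, which HDMI 1.4 can manage")
```

This matches the behaviour described above: over HDMI 1.4 you are limited to 4K at 30 fps, while HDMI 2.0's 18 Gbps link is what makes 4K at 60 fps possible.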

This is my understanding. I may be mistaken, in which case I would love to be corrected.

As for the GPU/video card suggestion, you won't be able to get 4K video from 1 GB or 2 GB cards. 4 GB cards offer 4K video, but you still need to be aware of input/output (I/O) matching between the card and the monitor.

Older 4K cards don't automatically come with the necessary I/O ports. An example would be a card with 4K output only from its DisplayPort while the monitor lacks a DisplayPort input.

Even on 1 GB or 2 GB cards you may need a DVI-to-HDMI adapter, because some of the older cards did not have HDMI or DisplayPort outputs.

If you're purchasing a new computer, this will be clarified in the specs, which list the resolution output for each port on the GPU/video card.

If you're buying a used computer and planning to upgrade, the resolution output will be listed on the GPU/video card box. You'll have to Google the card if you're buying a used computer with a card already installed.

If you're going to upgrade yourself or purchase an upgraded computer, you also need to be aware of the computer's power-supply wattage. I believe a 4K card calls for a minimum 400 W PSU; a 300 W PSU won't last long with a 4K card installed.
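As a rough illustration of that PSU sizing, here is a back-of-envelope sum. Every wattage below is an assumed, typical figure chosen for illustration, not a measured value - check your own components' spec sheets:

```python
# Back-of-envelope PSU sizing. All component wattages are illustrative
# assumptions; real parts vary, so consult the actual spec sheets.

parts_watts = {
    "CPU": 95,            # typical desktop i5/i7 TDP
    "GPU": 150,           # midrange card capable of 4K output
    "motherboard + RAM + fans": 50,  # rough combined figure
    "SSD + HDD": 15,
}

total = sum(parts_watts.values())
headroom = 1.3  # ~30% margin keeps the PSU in its efficient load range
recommended = total * headroom

print(f"Estimated draw: {total} W; recommended PSU: at least {recommended:.0f} W")
```

With these assumed figures the estimate lands just above 400 W, which is consistent with the minimum-400 W advice above, and shows why a 300 W supply would be running near its limit.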

I don't know anything about 6K or 8K.

Good Luck my friend.
 
All spot on!
Video is currently evolving rapidly in terms of resolution, frame rate, color depth and contrast, really outpacing the standards-setting bodies. The latest proposed standards would stream around 50 Gbps, adequate for 8K video, but AFAIK the hardware is still very spotty.
Still photographers are fortunately somewhat less affected, as their image data does not need the refresh rates video requires, so older/slower video cards may be quite adequate for their needs. However, balancing display quality and budget remains challenging. The market for photography-oriented computer setups is too small to have attracted many competitors; instead, most packages are offered with consumer-focused displays that are bigger but lack the color accuracy photographers require.
The iMac 5K is a good option by all accounts, but carries the usual Apple premium. A used one from a pro photographer is available here for $2,400 - interesting because the configuration is indicative of what is used in practice.

https://diglloyd.com/blog/2018/20180208_0812-Lloyds-gear-sale.html
 
Irrespective of platform, I would recommend using two monitors if you have the space. Nearly all modern graphics cards will support this. My studio setup has two desktop computers, both have 19" and 28" monitors attached. The advantage is you can put all the tools on one screen and the work area on the other, letting you focus on the image without any screen clutter.
 
Mac if you can afford it, I think most pros and serious amateurs use them?
If you're going to work in isolation (or with just a few other friends you need to be compatible with) and like Windows, there is nothing wrong with a PC, even for Photoshop.

If you can build your own PC, you can save a lot of money and get a nice rig.

If you plan to up your game and need a wider group of help in photography, there may be some logic in getting a Mac. The Photoshop talent tends to be on that side of the fence.

If you have an iPhone, then a Mac makes more sense (and vice versa).

I use a Mac (and also iOS by extension), but I'm in the middle of a long career in graphics and marketing and thus have used Macs for decades. I've also built my own PC rigs (primarily for gaming), so it's not like I'm totally anti-Windows either.
 