The post Unlocking the Power of Pixels: A Guide to Understanding Graphic Cards appeared first on PC Tech Magazine.
A graphics card, also referred to as a video card, is a printed circuit board containing the GPU and video random access memory (VRAM). It generates and processes images and video for display on a monitor. Without a graphics card, a computer would only be capable of displaying the simplest forms of graphics.
A graphics card is a hardware accelerator with several key components that together generate the graphics users see. Let's break down the main components:
The GPU is the heart of every graphics card. It is a specialized processor designed to perform graphics computations at high speed. Today's GPUs feature hundreds or even thousands of cores that carry out massively parallel computations.
VRAM is high-speed buffer memory that stores the graphical data being rendered, such as textures, models, and the framebuffer. More VRAM lets a graphics card keep more of this data close at hand for quick retrieval. Modern cards typically ship with 4 to 16 GB of VRAM.
The DAC (digital-to-analog converter) converts the card's digital video signal into an analog form for displays with analog inputs, such as VGA monitors, enabling visual output on legacy hardware.
The video BIOS is firmware stored on the card that contains the low-level software needed for the card's basic operations, such as initializing it at boot.
Output ports such as HDMI, DisplayPort, DVI, and VGA carry the card's output to displays, though DVI and VGA are less common today. Many cards include several ports to allow for different display configurations.
Graphics cards generate a lot of heat from the intensive computations they perform. Heat sinks, heat pipes, and multiple fans keep temperatures under control before critical levels are reached. Higher-end options include liquid cooling and vapor chambers.
Deciding on what specific graphics card to purchase can often be a very challenging endeavor because of the large number of available units. Here are some key factors to consider:
Decide why you want the card. High-definition gaming, video editing, 3D design, and even data analysis all demand more processing power. For normal office use or light workloads without graphics-heavy software, integrated graphics will do.
The GPU is the single most important component on a graphics card and typically defines it. Two companies dominate the GPU market: Nvidia and AMD. Research which GPU best suits your planned use and offers the most performance per dollar.
More VRAM means a GPU can hold and process more graphical data at once. Four to eight gigabytes of VRAM are sufficient for 1080p gaming, while twelve to sixteen gigabytes are beneficial for 4K gaming and video editing.
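As a rough illustration of why resolution drives VRAM needs, here is a back-of-envelope sketch. The 4-bytes-per-pixel figure covers only a single uncompressed frame; real games need far more VRAM for textures, geometry, and intermediate render targets.

```python
# Back-of-envelope framebuffer sizes. Assumes 4 bytes per pixel
# (8-bit RGBA) and counts only one uncompressed frame; actual VRAM
# usage in games is many times higher.

def framebuffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of a single uncompressed frame in MiB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

for name, (w, h) in [("1080p", (1920, 1080)), ("1440p", (2560, 1440)), ("4K", (3840, 2160))]:
    print(f"{name}: {framebuffer_mib(w, h):.1f} MiB per frame")
```

A 4K frame is four times the size of a 1080p frame, which is one reason higher resolutions call for cards with more VRAM.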
Also check that your graphics card supports the display connections you need, such as HDMI 2.1 or DisplayPort 1.4.
Adequate airflow keeps a graphics card within safe temperatures under demanding workloads, so make sure your case provides good cooling.
Higher-end cards need additional PCIe power connectors from your power supply unit. Check that the PSU's wattage is sufficient for the card and that the required PCIe connectors are available.
Asus, MSI, Gigabyte, EVGA, and Zotac are among the most reputable graphics card brands, known for reliable products and responsive customer support.
See also: Techie’s guide: How to choose the best laptop for yourself
Occasionally, you may encounter issues with your graphics card. Here are some common problems and troubleshooting steps:
No display: If your monitor shows no signal, try updating your video card's driver. Open Device Manager, right-click the display adapter, select Uninstall, reboot your computer, and install the latest driver.
Artifacts on the screen: Flickering, dots, or lines on your screen may point to a faulty graphics card. Try a clean driver installation: use Display Driver Uninstaller (DDU) to remove the prior driver, then install a fresh one.
Game crashes or freezes: Some game issues can be attributed to the graphics driver. Run DDU to fully uninstall the current driver, then download the newest driver directly from the manufacturer.
Low frame rates: If you find yourself at a lower FPS than expected, reset your graphics settings or try a clean driver install. Monitoring CPU and GPU load and temperatures can also help diagnose performance problems.
BSOD (Blue Screen of Death): This is a critical system error that can be caused by various factors, including faulty hardware or driver conflicts. Try resetting a graphics card driver or reinstalling it. If the problem persists, it may indicate a hardware issue with the card.
In the world of graphics cards, new technologies and innovations appear every single day. Here are some trends shaping the future of graphics cards:
Ray Tracing
Ray tracing simulates the physics of light for far more realistic lighting, reflections, and shadows. Nvidia's RTX cards incorporate dedicated ray tracing units, and AMD's cards also support ray tracing. Expect the technique to become more widespread.
AI Upscaling
Nvidia's DLSS uses neural networks to upscale frames and boost frame rates, and AMD's FSR offers similar upscaling. Future advances in AI will improve both rendering speed and image quality.
Cloud Gaming
Companies such as Playgiga offer game streaming services in which you do not need powerful equipment at home. The graphics are rendered on dedicated servers equipped with GPUs. However, many broadband services are still limited by monthly data transfer allowances, also known as bandwidth caps.
Virtual Reality (VR) and Augmented Reality (AR)
VR headsets demand consistently high frame rates rendered in stereo, which makes them highly GPU-intensive. Future headset generations are likely to push GPU requirements even further, and AR glasses will also benefit from advances in graphics technology.
Innovation in graphics cards shows no sign of slowing, and the trends above make that clear. As GPUs grow more powerful, efficient, and intelligent year by year thanks to AI and specialized cores, games, creative software, and virtual reality will reach an entirely new visual level.
The post 5 Essential Features of Sonic Switch Technology appeared first on PC Tech Magazine.
In the past, switches usually shipped with proprietary software from a single company. This made it hard for network operators to manage everything consistently or get a unified view. Sonic switches are different because their code is open for anyone to change, which also allows Sonic switches from multiple companies to work together more efficiently.
As more data centers use Sonic switches, they have proven very helpful for connecting vast numbers of computers. This is because Sonic's structure and rules make it flexible: engineers can customize Sonic switches to best suit their unique network needs, and workloads are shared automatically across many switches and CPUs.
These are five key elements of Sonic switches that make them suitable for software-defined networking in today’s most prominent computer networks.
Because Sonic switches are open-source, operators fully control the software and can customize it to best accommodate their unique network needs, something proprietary switch software allowed far less flexibility for. Engineers can also seamlessly apply identical features across all Sonic switches, regardless of underlying hardware differences.
A hardware abstraction interface decouples network functions from the underlying switch designs, streamlining deployment and management. Consequently, engineers can uniformly administer Sonic switches from diverse vendors using consistent tools.
Moreover, features such as VLANs, access controls, routing protocols, and more can optionally be turned on or off per Sonic switch as required. Likewise, Sonic switches integrate well with open-source management platforms such as Ansible, permitting consistent automated configuration of entire fleets. Ultimately, the adaptable design of Sonic switches empowers engineers to build infrastructure solutions without restrictions from vendors.
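As a loose illustration of that fleet-wide consistency, here is a minimal Python sketch. The switch names and the shape of the configuration document are invented for this example; they are not real Sonic or Ansible syntax.

```python
# Illustrative only: render one identical VLAN policy for every
# switch in a mixed-vendor fleet, the way automation tooling might.

def vlan_config(switch: str, vlans: list) -> dict:
    """Build one VLAN configuration document for a single switch."""
    return {"switch": switch,
            "vlans": [{"id": v, "name": f"Vlan{v}"} for v in vlans]}

fleet = ["leaf-01", "leaf-02", "spine-01"]   # hypothetical switch names
configs = [vlan_config(sw, [10, 20, 30]) for sw in fleet]

# Every switch receives exactly the same VLAN policy.
assert all(c["vlans"] == configs[0]["vlans"] for c in configs)
print(f"rendered {len(configs)} identical configs")
```

The point is not the code itself but the pattern: one template, applied uniformly, regardless of which vendor built each box.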
As cloud computing and immense data centers continue to grow, handling massive networks is essential for switch software. Sonic was built with that scale in mind from the start at Microsoft, which runs some of the world's biggest networks, so Sonic switches have been optimized for massive deployments since their early development.
See also: Emerging trends in IT consulting: cloud computing, AI, and cybersecurity
In addition, Sonic switches support enormous routing tables, vast numbers of interfaces and VLANs, and switching speeds measured in terabits per second. Their modular design allows for more line cards and higher speeds over time as needs change, so a Sonic switch stays useful even as requirements expand. They also take full advantage of the Linux operating system to exploit today's powerful CPUs and memory. For this reason, Sonic maintains high performance through its focus on scale and efficiency.
Key capabilities include managing millions of routes, hundreds of thousands of VLANs, terabit-scale switching fabrics, and massive BGP and VXLAN tables. Sonic switches move network data at wire speed without compromise by carefully balancing features and performance. These scaling traits prove Sonic to be an industrial-grade switching OS capable of the largest cloud workloads.
Computer networks in data centers need all equipment to share the same tools to simplify management. Sonic switches, which use SAI, make this possible. SAI provides a layer independent of specific switch designs and companies. Any network function defined through the SAI can efficiently work on all Sonic-supported switches.
Therefore, regardless of the manufacturer, every Sonic switch can have matching tools. Examples include technologies for connecting devices, finding neighbors on the network, secure virtual private networks, fast 25Gb connections, and more— all working the same across different Sonic switches.
Having consistent tools applied identically streamlines network preparation, troubleshooting, and change management. Network operators can pick the best Sonic switches based on size, ports, or cost instead of being restricted to just one company’s options. Sonic’s SAI approach also means Sonic switches from multiple vendors are guaranteed to work together based solely on networking needs.
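The SAI idea can be sketched as an abstraction layer in a few lines of Python. This is a toy model, not real SAI code; the class and method names are invented for the sketch.

```python
# Toy model of the SAI concept: operator tooling talks to one
# abstract interface while vendor-specific back-ends do the work.
from abc import ABC, abstractmethod

class SwitchASIC(ABC):
    @abstractmethod
    def create_vlan(self, vlan_id: int) -> str:
        ...

class VendorA(SwitchASIC):
    def create_vlan(self, vlan_id: int) -> str:
        return f"vendor-a ASIC: VLAN {vlan_id} programmed"

class VendorB(SwitchASIC):
    def create_vlan(self, vlan_id: int) -> str:
        return f"vendor-b ASIC: VLAN {vlan_id} programmed"

def provision(asic: SwitchASIC, vlan_id: int) -> str:
    # The same operator code works against any conforming back-end.
    return asic.create_vlan(vlan_id)

for backend in (VendorA(), VendorB()):
    print(provision(backend, 100))
```

Swapping the vendor back-end changes nothing in the operator's tooling, which is the interoperability guarantee the article describes.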
Keeping networks running is crucial, but ensuring consistency across multi-vendor systems is hard. Testing by cloud operators like Microsoft has proven Sonic's robustness and its ability to handle massive-scale traffic without issue.
Sonic development also relies on automated checks of every change, continuous integration practices, and security fixes inherited from Linux. Its developers run Sonic switches in production themselves, so the code is battle-tested in real deployments, and strict baseline rules have hardened Sonic against failures.
The cost of running open-source Sonic switches is much lower than that of proprietary network OS licenses. Sonic can run on commodity switch hardware without the per-port fees that inflate hardware expenses, and the ability to keep improving existing Sonic switches extends their usable lifetimes.
Standard programming models and identical features among Sonic switches simplify daily operations, too. Less effort is required to manage OS versions on each switch separately. Standardizing on Sonic reduces duplicate code and configuration drift, and automation becomes more straightforward thanks to consistent interfaces to the physical equipment.
Overall infrastructure costs are lowered through better utilization, a smaller support workload, and optimized initial hardware spending. As more Sonic switches are adopted, open competition drives costs down further. These advantages let modern networks grow within budget.
Thus, Sonic switches have proven to be powerful networking software that connects huge groups of computers smoothly. Its openness lets people customize it to match their unique network needs perfectly. Whether handling millions of devices, applying consistent tools, ensuring reliability, or reducing long-term expenses —Sonic brings many advantages over proprietary options.
See also: Top 7 tools for setting up your home network
The post Taiwan Semiconductor Stock: Evaluating Potential appeared first on PC Tech Magazine.
The company collaborates with major clients such as HiSilicon, MediaTek (TWSE:2454), Huawei, Realtek (TWSE:2379), AMD (NASDAQ:AMD), NVIDIA (NASDAQ:NVDA), Qualcomm (NASDAQ:QCOM), ARM Holdings (NASDAQ:ARM), Altera, and Xilinx, as well as with Apple (NASDAQ:AAPL), Broadcom (NASDAQ:AVGO), Conexant, Marvell (NASDAQ:MRVL), Intel (NASDAQ:INTC) and others.
TSMC is actively expanding its production facilities and global presence, with offices in various countries such as China, India, Japan, South Korea, the Netherlands, and the USA.
In 2021, TSMC held a significant 52.1% share of the contract chip manufacturing market. Notably, in 2018, the company achieved a breakthrough by developing technologies for producing microchips with standards ranging from 90 to 5 nanometers.
As of the latest financial report on January 18, 2024, reflected in the earnings calendar, TSMC's profit and revenue slightly declined but surpassed investors' expectations. Quarterly revenue decreased by 1.5% to $19.62 billion, and net profit fell by 19.3% to $7.56 billion. Despite this, TSMC maintains its leadership in technological innovation, introducing new technologies and excelling in semiconductor manufacturing.
The company’s commitment to quality and reliability is reinforced by advanced technologies, experience, and streamlined automated processes that enhance production efficiency and reduce costs, ensuring competitiveness in the market.
Looking beyond the recent quarterly results, TSMC’s focus on future growth is evident through planned capital expenditures of $28 to $32 billion for the current year. The company aims to allocate funds for the operation of its plant in Japan, the construction of two factories in Arizona and one in Germany, and continued investment in advanced technological processes, mature technology, and packaging technologies.
Despite some challenges, TSMC’s positive trajectory is reflected in the chart, showing a 9.79% increase after the recent report.
Given the company’s ambitious plans for expansion and innovation, investors are keenly observing TSMC’s revenue prospects for 2024 and beyond. TSMC’s commitment to educational programs and research further supports analysts’ expectations of sustained growth. The company’s active role as a key player in the semiconductor market and its dedication to offering innovative solutions and improving product quality positions TSMC as a noteworthy entity in the stock market’s landscape.
The post 5 Best LGA 1150 CPUs for 2023: Ranked and Reviewed appeared first on PC Tech Magazine.
The benefit of going with an older CPU is that you can complete an entire build for a fraction of the price of sourcing new components. So, after looking at numerous options, PC Tech's top picks for the best LGA 1150 CPUs are:
The Intel Core i7-4790K was the flagship CPU of the Haswell generation. You’ve got four operational cores and eight processing threads at play. The CPU itself supported DDR3 RAM, with a maximum of 32 GB supported.
For gaming, the 4790K still has quite a bit of bite left in it. Modern games run just fine with a supported GPU. You might not be maxing out the likes of Starfield, but plenty of other modern titles run like a dream.
The 4790K has plenty of thermal headroom and can overclock. When paired with a compatible motherboard and ample cooling, this CPU is more than up to most modern tasks.
Pros
Cons
The Intel Core i7-4790 is functionally identical to the 4790K in most regards. It loses out on the ability to overclock, but you still have access to the same speedy four cores and eight threads of processing power.
It is a little slower than the 4790K, with a maximum clock frequency of 4.0 GHz compared to 4.4 GHz on the K variant.
That said, you’ve got plenty to enjoy with the i7-4790. It does quite well with modern games, especially if you’ve maxed out the RAM and have a modern GPU. It does falter a bit with the likes of AI applications, especially when run natively. That is to be expected; the i7-4790 predates most popular AI frameworks by several years.
Pros
Cons
Intel’s i7-4770 is a great choice for builders on a budget. You can routinely find the i7-4770 for cheaper prices than the more deluxe 4790K and 4790. That said, you still have access to four relatively fast cores and eight threads of processing power.
The maximum clock frequency tops out at 3.9 GHz, even when factoring in Intel’s Turbo Boost. The i7-4770 will falter occasionally with more modern games. When paired with a modern GPU, it has more than enough power to handle most tasks but can bottleneck with CPU-bound games.
Pros
Cons
The Intel Xeon E3-1231 v3 is a wolf in sheep's clothing. While the Xeon line is aimed primarily at workstations and servers, these chips can be put to some fun use cases by the average user.
The Xeon E3-1231 v3 has a particular edge because it supports expanded instruction sets. As an enterprise-grade CPU, it comes with support for the likes of SSE4.1, SSE4.2, and AVX2. You can still use it for modern gaming, and it will more than ably handle demanding audio and video workloads.
It does have a lower maximum frequency than the previously mentioned CPUs, with a maximum speed of 3.8 GHz. However, if you’re looking to build a PC that can work and play with the best of them, it is hard to beat.
Pros
Cons
If you’re looking for blazing single-core performance, the Intel Core i5-4690 has you covered. This is the first and only CPU on this list that lacks eight processing threads. Instead, you’re left with just four cores and four threads.
Now, in 2023, the i5-4690 is going to be a poor partner for multi-threaded applications. However, when taking a look at processes and games that heavily lean on a single core, there is plenty to enjoy. The 4690 is a great CPU for gaming, especially when paired with a newer GPU.
You’ll have a less than enjoyable time when using this processor for multi-threaded tasks like AI training or video rendering.
However, if you’re just looking to game, the i5-4690 is a great choice.
Pros
Cons
To conclude, the most important consideration in picking one of the best LGA 1150 CPUs comes down to the cores and threads available. In 2023, cores and threads are crucial for most intensive computing tasks.
At the bare minimum when looking at an LGA 1150 CPU, you’ll want to make sure it has at least four cores.
This will give you more than enough power to handle most tasks. When paired with the maximum amount of RAM and a decent GPU, gaming should be a breeze. Still, these are older CPUs, so you'll need to temper your expectations.
Written and includes input from online sources
The post Why Businesses Should Embrace inq. Wi-Fi 6 and Edge Solutions appeared first on PC Tech Magazine.
Designed to meet the evolving needs of businesses and government, the inq. Wi-Fi 6 products offer speed, capacity, and efficiency, transforming the way people experience wireless networks. With enhanced performance in crowded environments, improved latency, and support for multiple devices, the technology paves the way for uninterrupted streaming and efficient remote work collaboration.
Complementing the inq. Wi-Fi 6 offerings, its Edge solutions redefine the concept of edge computing, bringing processing and data storage closer to the source for real-time insights and faster decision-making. This empowers businesses across industries to harness the power of data, enhancing operational efficiency and enabling rapid innovation.
Key features of the inq. Wi-Fi 6 and Edge solutions include:
Here’s why businesses should consider embracing Wi-Fi 6 and Edge solutions:
1. Improved Performance: Wi-Fi 6 offers significantly faster data speeds compared to previous generations. This is especially important for businesses that rely on high-bandwidth applications like video conferencing, cloud computing, and data analytics. It can handle more devices simultaneously without sacrificing performance.
2. Increased Capacity: Wi-Fi 6 uses advanced technologies like Orthogonal Frequency Division Multiple Access (OFDMA) to efficiently divide the wireless channel into smaller sub-channels. This allows for more devices to connect simultaneously without causing network congestion, making it ideal for crowded business environments.
3. Better Coverage: With Wi-Fi 6, businesses can experience improved coverage and range. This means that fewer access points may be needed to cover a large area, reducing infrastructure costs and simplifying network management.
4. Enhanced Security: Wi-Fi 6 includes WPA3, the latest Wi-Fi security protocol, which offers stronger encryption and protection against brute-force attacks. This is crucial for businesses that handle sensitive data and need to maintain the confidentiality and integrity of their network.
5. Lower Latency: Lower latency in Wi-Fi 6 networks means that real-time applications like video conferencing, online gaming, and IoT devices can perform more efficiently. Reduced latency can improve user experiences and operational efficiency.
6. IoT Readiness: As the Internet of Things (IoT) continues to grow, Wi-Fi 6 provides the infrastructure necessary to support a multitude of connected devices, each with unique data and communication requirements. It’s better equipped to handle the diverse traffic generated by IoT devices.
7. Improved Battery Life: Wi-Fi 6 includes a feature called Target Wake Time (TWT), which allows devices to schedule when they wake up and communicate with the access point. This feature helps conserve battery life in devices, making it beneficial for businesses using battery-powered IoT sensors and devices.
8. Better Quality of Service (QoS): Wi-Fi 6 enables businesses to prioritize specific types of traffic, ensuring that critical applications receive the necessary bandwidth and low latency. This is essential for maintaining consistent performance across the network.
9. Future-Proofing: Investing in Wi-Fi 6 now can help future-proof your network for several years. As more Wi-Fi 6-capable devices enter the market, your business will be prepared to handle the increasing demand for high-speed and reliable wireless connectivity.
10. Competitive Advantage: Businesses that adopt the latest technology tend to stay ahead of the competition. Offering a fast, reliable, and secure Wi-Fi network can enhance customer experiences, employee productivity, and overall business operations, giving your company a competitive edge.
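The OFDMA point in item 2 can be sanity-checked with rough 802.11ax figures. The 78.125 kHz subcarrier spacing comes from the Wi-Fi 6 standard; treat the sketch below as illustrative arithmetic, not a channel planner.

```python
# Rough OFDMA arithmetic for 802.11ax (Wi-Fi 6): subcarriers sit
# 78.125 kHz apart, so a 20 MHz channel holds 256 subcarrier slots,
# which the access point can carve into smaller resource units so
# many clients transmit in the same time slot.

SUBCARRIER_SPACING_KHZ = 78.125

def subcarrier_slots(channel_mhz: float) -> int:
    """Total subcarrier slots that fit in a channel of this width."""
    return int(channel_mhz * 1000 / SUBCARRIER_SPACING_KHZ)

print(subcarrier_slots(20))    # 256 slots in a 20 MHz channel
print(subcarrier_slots(80))    # 1024 slots in an 80 MHz channel
```

Because those slots can be grouped into resource units of different sizes, a single transmission opportunity can serve several low-bandwidth devices at once instead of one at a time.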
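The Target Wake Time savings in item 7 can be illustrated with a toy duty-cycle model. The wake interval and awake duration below are made-up example values, not figures from any real device.

```python
# Toy duty-cycle model of Target Wake Time (TWT). A device that
# negotiates a wake schedule powers its radio only briefly each
# interval instead of listening continuously.

def radio_duty_cycle(awake_ms: float, interval_ms: float) -> float:
    """Fraction of time the radio stays powered under a TWT schedule."""
    return awake_ms / interval_ms

always_on = radio_duty_cycle(1000, 1000)  # no TWT: radio never sleeps
with_twt = radio_duty_cycle(10, 1000)     # wake 10 ms out of every second
print(f"radio-on time drops from {always_on:.0%} to {with_twt:.0%}")
```

Even this crude model shows why TWT matters for battery-powered IoT sensors: radio-on time falls by orders of magnitude when the device only wakes on schedule.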
While Wi-Fi 6 offers many benefits, it’s essential to consider factors like budget, network design, and specific business needs before upgrading. Conducting a thorough assessment of your organization’s requirements and consulting with IT professionals can help determine if Wi-Fi 6 is the right solution for your business.
ALSO READ: CPU CORES EXPLAINED: THE HEART OF HIGH-PERFORMANCE COMPUTING
The post CPU Cores Explained: The Heart of High-Performance Computing appeared first on PC Tech Magazine.
Central Processing Units (CPUs) are the brains of computers, responsible for executing instructions and performing calculations that drive your device's functionality. CPU cores are at the heart of this operation.
A CPU core can be thought of as an independent processing unit, capable of executing instructions and performing tasks on its own. Imagine you are a chef in a bustling kitchen. In this analogy, each chef represents a CPU core. The more chefs you have, the more dishes you can prepare simultaneously.
Similarly, a CPU with multiple cores can handle multiple tasks concurrently, a concept referred to as multitasking. Each core operates independently, executing instructions for different tasks without waiting for one to finish before starting the next.
When you run a computer program or perform tasks like browsing the internet, watching videos, and editing documents simultaneously, each task can be assigned to a separate core. As a result, the overall performance is improved, and the computer operates more efficiently. For instance, while one core handles video rendering, another can process user inputs, and yet another can run background tasks like updates or antivirus scans.
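The chef analogy maps directly onto how programs exploit multiple cores. A minimal Python sketch using a process pool, where each worker stands in for one core:

```python
# Each pool worker is analogous to one CPU core: four independent
# CPU-bound jobs run concurrently instead of queueing on one core.
from concurrent.futures import ProcessPoolExecutor

def busy_task(n: int) -> int:
    """Stand-in for a CPU-bound job (rendering, a scan, an export)."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [200_000, 300_000, 400_000, 500_000]
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(busy_task, jobs))
    print(len(results), "jobs finished")
```

On a quad-core machine the four jobs proceed in parallel; on a single core they would run one after another, which is exactly the multitasking difference described above.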
Different applications have varying demands on a CPU. Some tasks, like word processing or basic web browsing, don’t require a significant amount of processing power. On the other hand, tasks like video editing, 3D rendering, and gaming demand substantial computational resources.
Think of CPU cores as a team of workers in a factory. In a factory that produces intricate machinery, you need specialized workers with specific skills to handle different assembly stages. Similarly, applications benefit from having the right number of CPU cores at their disposal. A video editing software, for example, thrives when it can distribute the workload across multiple cores. This allows for faster rendering times and a smoother editing experience.
CPU cores are just one aspect of a processor’s architecture. Another crucial factor is the clock speed, measured in gigahertz (GHz). Clock speed determines how many instructions a core can execute per second. While it might be tempting to believe that a higher clock speed always translates to better performance, the relationship between core count and clock speed is a delicate balance.
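One way to see that balance is Amdahl's law: extra cores only speed up the parallel share of a workload, so core count and clock speed trade off rather than simply adding up. A quick sketch with illustrative numbers:

```python
# Amdahl's law: speedup over one core when only part of the
# workload can run in parallel. The 90% figure is illustrative.

def speedup(cores: int, parallel_fraction: float) -> float:
    """Ideal speedup over a single core for a partly parallel workload."""
    serial = 1 - parallel_fraction
    return 1 / (serial + parallel_fraction / cores)

# A workload that is 90% parallelizable:
print(f"4 cores:  {speedup(4, 0.9):.2f}x")
print(f"16 cores: {speedup(16, 0.9):.2f}x")   # diminishing returns
```

Quadrupling the core count from 4 to 16 roughly doubles, not quadruples, the speedup here, which is why a few fast cores can beat many slow ones on mostly serial tasks.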
Navigating the world of CPUs involves considering the intricate interplay between core count and clock speed. The market offers a variety of CPU designs, each tailored to specific use cases. Understanding these designs can help you make informed decisions based on your computing needs.
It’s crucial to align your choice with the specific use case you have in mind. Here’s a breakdown of different CPU core configurations and where they fit best:
These CPUs are the backbone of basic work computers. They are designed for tasks like word processing, web browsing, and email. Their low-to-moderate clock speeds ensure energy efficiency and smooth operation for everyday computing tasks.
Dual-core CPUs have two independent cores, while quad-core CPUs feature four. They typically operate with clock speeds ranging from 2.0 to 3.5 GHz. These clock speeds are chosen for energy efficiency and to provide adequate performance for everyday tasks.
Ideal for gaming computers, these CPUs strike a balance between core count and clock speed. Gaming demands multitasking capabilities to run the game itself, alongside background tasks like streaming or voice chat.
Hexa-core CPUs boast six cores, while octa-core CPUs pack eight. Their clock speeds usually range from 3.0 to 4.5 GHz. These CPUs strike a balance between the number of cores and the clock speed to ensure a seamless gaming experience.
These cores are like a team of synchronized athletes, ensuring the game runs smoothly while accommodating additional tasks without compromising performance.
A haven for content creators and designers, these CPUs offer high core counts and enhanced clock speeds.
Applications like video editing, 3D rendering, and graphic design heavily rely on multiple cores to expedite complex computations.
Octa-core CPUs contain eight cores, while the more advanced deca-core CPUs house ten. Their clock speeds often fall within the 3.0 to 4.5 GHz range. The combination of numerous cores and respectable clock speeds accelerates processes like video editing and rendering.
These cores resemble a dedicated assembly line, each core contributing to the creative process, resulting in quicker rendering and improved workflow.
Database administrators benefit from these CPUs, which strike a balance between core count and clock speed. Managing databases involves intricate calculations and data manipulation, making a combination of cores and clock speed essential. These cores act as skillful data analysts, efficiently managing and optimizing databases while maintaining responsiveness for administrative tasks.
Selecting the appropriate CPU configuration involves understanding your computing needs and matching them with the right core count and clock speed. Just as a chef selects the finest ingredients for a recipe, you can curate your computing experience by selecting a CPU that aligns perfectly with your intended tasks.
Whether you are crunching numbers in a spreadsheet, immersing yourself in the world of virtual gaming, bringing digital art to life, or managing complex databases, the realm of CPU cores has a solution tailored to your requirements.
By considering the interplay between core count and clock speed, you’re equipped to navigate the CPU market with confidence and make choices that optimize your computing experience.
The post The State of Liquid Cooled PCs in 2022 appeared first on PC Tech Magazine.
Join our time-travel train as we go back to the year 2017. The Nvidia RTX 2080 was still a dream that would not come to life until late 2018. Unless you were into crypto mining or servers, there was really nothing you could do on a PC that needed the advantage liquid cooling brought.
With the launch of the RTX series, you could almost feel the shift in gears, almost as if we collectively said "now you're talking". Originally, only true enthusiasts could be bothered with setting up a liquid cooling system: it's expensive and requires a lot of technical knowledge.
The streaming community soon changed that. Famous gamers showing off their RGB setups spread the love of liquid cooling to any willing disciple. Now the time-travel train is back in 2022, when manufacturers are rolling out products in a frenzy to meet this demand.
With the help of social media, it's natural to assume liquid cooling is a fairly new concept; the reverse is the case. Let's go way back to 1964, when IBM launched the first liquid cooling system.
During this period, processors and other components were far less efficient than they are now, which meant they generated a lot of heat, too much for the best fans of the day to handle. Liquid cooling was the only way to manage heat transfer. IBM's early system was bulky, but that is to be expected for the time. It worked by pumping cold water, via a liquid-to-liquid heat exchanger, to the components that needed to be cooled.
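The physics behind that approach is simple to sketch: the heat a water loop carries away is flow rate times water's specific heat times the temperature rise. Water's specific heat is a physical constant; the flow and temperature figures below are illustrative.

```python
# Q = m_dot * c * dT: heat carried away by a coolant loop. Water's
# high specific heat (~4186 J/(kg*K)) is what makes even a modest
# flow so effective at moving heat.

WATER_SPECIFIC_HEAT = 4186.0   # J/(kg*K)

def heat_removed_watts(flow_kg_per_s: float, temp_rise_k: float) -> float:
    """Steady-state heat (W) a loop absorbs at a given flow and delta-T."""
    return flow_kg_per_s * WATER_SPECIFIC_HEAT * temp_rise_k

# Roughly 1 L/min of water (about 0.0167 kg/s) warming by 5 K:
print(f"{heat_removed_watts(0.0167, 5):.0f} W")
```

A trickle of water warming by just five degrees can soak up a few hundred watts, more than enough for the hot components of the era, and the same principle drives today's PC loops.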
Over the years, humans improved on the idea, as we tend to do; this heralded the adoption of CMOS technology. With CMOS, the energy required to run certain components was drastically reduced, and heat output fell with it. If it weren't for CMOS, air cooling might never have had the edge it needed to be in the same conversation as liquid cooling.
In the year 2000, we saw the release of a hybrid cooling system in the Fujitsu GS8900 mainframe. The way it worked was pretty straightforward. First, cold plates (plates of heat-conductive metal) were attached to all the processor modules. Then an intricate network of refrigerant pipes was attached to each cold plate. As you might have guessed, water was pumped through this system to cool the processors down.
With each passing year, liquid cooling systems advanced. Let's skip the minor changes and go to 2012, the year two-phase immersion cooling systems appeared. For once, gamers were not the ones clearing the shelves; it was crypto miners, our neighborhood friends responsible for your inability to find an RTX 2060 online.
In 2022, unfortunately, a computer case with built-in liquid cooling is still not a thing; it seems we unanimously decided the hybrid liquid system is the most efficient. Our predecessors used refrigerated water, yes, actual chilled water, to cool things down. Now we use a radiator along with a fan, a pump, and connecting tubing to achieve the same result. It's great to see how far we've come.
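The radiator-pump loop described above can be sized with a basic heat-balance estimate: the heat a loop carries away is the coolant mass flow times water's specific heat times the temperature rise across the block. A minimal sketch; the flow rate and temperature rise below are illustrative assumptions, not figures from any particular cooler:

```python
# Heat carried away by a water cooling loop: Q = m_dot * c_p * delta_T
WATER_SPECIFIC_HEAT = 4186.0   # J/(kg*K), specific heat of water
WATER_DENSITY = 1.0            # kg/L, close enough for coolant near room temperature

def loop_heat_watts(flow_l_per_min: float, delta_t_c: float) -> float:
    """Heat (W) a loop removes at a given flow rate and coolant temperature rise."""
    mass_flow = flow_l_per_min * WATER_DENSITY / 60.0  # convert L/min to kg/s
    return mass_flow * WATER_SPECIFIC_HEAT * delta_t_c

# Example: 1 L/min with the coolant warming 5 degrees C across the CPU block
print(round(loop_heat_watts(1.0, 5.0)))  # ~349 W
```

Even a modest flow rate comfortably absorbs the heat of a high-end CPU or GPU, which is why water loops outpace air coolers of similar size.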
We did say humans have come very far; who would've thought a liquid-cooled laptop was even a remote possibility? A laptop is meant to be compact and portable, and right now it's still not possible to fit the full liquid cooling setup inside (i.e., pump, radiator, tubing, etc.). But that doesn't mean we can't put the cooling system outside.
This is what brands like XMG and CyberPowerPC have been able to achieve. Believe us when we say an external liquid cooling system is still a great feat of human ingenuity. Even though the main cooler housing sits outside, water still enters the laptop, so think about that before you get any ideas. Right now, only laptops like the CyberPowerPC Tracer VI Edge Pro Liquid Cool and the XMG NEO 15 have the internal design to work with such setups.
It does promise to be very efficient. If you own a gaming laptop, you know that temperatures of 90℃ are easily reached under heavy gaming. A product like the XMG NEO 15, when paired with its Oasis cooling block, records temperatures in the 60℃ range. That's internet-surfing territory for a gaming laptop.
If you’ve modified your PC build in the last 6 months, then you know that the prices of GPU and even liquid cooling systems are not what they used to be. From the year 2018 to 2022, we saw GPU prices move from $400 to $700+. Admittedly, the economy has not been great worldwide, but that’s not the only reason
It was during 2018/2019 that the number of ETH and BTC miners surged. If there is one thing miners need, it's the processing power of the best graphics cards out there. If that were the only issue, perhaps prices now wouldn't be so bad. Then Covid came and made things worse: many companies shut down, and we are sure you started working from home too.
Unfortunately, graphics cards can't be made from home, and supply problems soon followed. Basic economics tells us that when demand rises while supply shrinks, prices go up. Major graphics card producers were no longer making new products, and the ones that were could not push out enough units to satisfy consumers.
That is how it all trickled down to you in your room, just trying to order an RTX 3060 Ti from Amazon. As the pandemic gradually subsides and we recover from its economic implications, prices should start dropping soon.
The post Drones and Drone Technology: What Are The Benefits? appeared first on PC Tech Magazine.
Drones are helpful in industrial settings for a variety of tasks, such as surveying disaster zones or inspecting hard-to-reach infrastructure. Drones used for inspections can provide you with real-time video and data from almost impossible-to-reach viewpoints and locations, including your worksite and equipment. They can also be used to monitor crops or assess damage after a natural disaster.
Drones also offer a new perspective, quite literally. Their ability to fly gives them a unique vantage point that can be used for things like surveying land or taking aerial photographs. This can provide valuable insights that would otherwise be unavailable.
Accessibility is one of the most obvious benefits of drones. If there is an area that is difficult or dangerous for humans to access, a drone can be used instead. Drones can handle a variety of dangerous or difficult jobs, such as inspecting power lines or surveying disaster areas. They can also be used to drop supplies or rescue people in hard-to-reach locations.
Wars in the modern age are not what they used to be even a few decades ago. Modern-day wars are fought with sophisticated weapons and technology. Drones can be used for various purposes in wars, such as reconnaissance, attack missions, and delivering supplies. Reconnaissance drones can gather information about the enemy's position and movements; this information can be used to plan military operations and choose the best course of action. Attack drones can carry out precision strikes against enemy targets, which can help reduce collateral damage and increase the chances of a mission's success.
Supply drones can deliver food, water, and other supplies to troops in the field. This can help reduce the need for resupply missions, which are often dangerous. Drones can also be used for other purposes, such as carrying out surveillance of a battlefield or providing troops with information about enemy activity.
Another big benefit of drones is that they can often be cheaper and easier to deploy than traditional methods. For example, it can be expensive and time-consuming to set up a manned aircraft or ground-based camera system. But a drone can be quickly deployed and operated at a fraction of the cost.
Another big advantage of drones is that they can provide real-time data and images. Drones are able to provide real-time data and images by using a variety of sensors and cameras. These sensors and cameras can be used to collect data about the environment, such as temperature, humidity, wind speed, and so on. The data collected by the drones can then be transmitted back to a base station in real-time, allowing for monitoring of the environment in near-real time.
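The telemetry flow described above, sensor readings collected onboard and transmitted to a base station, can be modeled very simply. A minimal sketch; the field names and the JSON wire format are illustrative assumptions, not any vendor's actual protocol:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class TelemetrySample:
    """One environmental reading a drone might stream to its base station."""
    drone_id: str
    timestamp_s: float      # seconds since epoch
    temperature_c: float
    humidity_pct: float
    wind_speed_mps: float

def encode(sample: TelemetrySample) -> str:
    """Serialize a sample to JSON for transmission over the downlink."""
    return json.dumps(asdict(sample))

sample = TelemetrySample("drone-01", 1_650_000_000.0, 21.5, 48.0, 3.2)
packet = encode(sample)
print(packet)
```

The base station would decode each packet as it arrives, which is what makes near-real-time monitoring of the environment possible.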
Additionally, the use of cameras aboard drones allows for visual monitoring of an area. This is extremely useful for tasks such as monitoring traffic and crops or assessing damage after a natural disaster. It also allows for faster decision-making as information can be gathered and analyzed in near-real-time.
Finally, drones are also environmentally friendly. Battery-powered drones produce no direct emissions, and their small size means they have a very small environmental footprint. This is an important consideration for industries looking to reduce their impact on the planet.
Overall, there are many benefits to using drone technology: a unique aerial perspective for surveying land or taking photographs, suitability for dangerous or difficult jobs like power line inspection and disaster surveying, low deployment cost, real-time data and imagery from onboard sensors and cameras, and a small environmental footprint. Drones really are an essential part of modern technology and are no doubt here to stay.
The post Samsung Reveals High-Performance PCIe 5.0 SSD for Enterprise Servers with Speeds of up to 13GBps appeared first on PC Tech Magazine.
Compared to PCIe 4.0 NVMe storage devices, Samsung says its PM1743 runs its interface at 32GT/s per lane, doubling the 16GT/s of current PCIe 4.0 drives.
The chipmaker also said its new PM1743 can provide a 2,500K IOPS random read speed, which, combined with its 13,000 MB/s sequential read speed, is 1.9x and 1.7x faster, respectively, than PCIe 4.0 drives.
“To reach such speeds, Samsung had to develop a proprietary controller for its PM1743 device. The chipmaker then enlisted the help of Intel to test the technology,” according to Intel’s director of technology initiatives, Jim Pappas.
“Intel has been working with Samsung to test Samsung’s newest PCIe NVMe SSD, the PM1743,” Pappas said, adding “together, we have jointly resolved complicated technical issues encountered with PCIe 5.0 during this initial evaluation period.”
The PM1743 offers increased write speeds of 6.6GBps and a random write speed of 250K IOPS, 1.7x and 1.9x faster compared to PCIe 4.0.
Samsung also claims its PM1743 can provide improved power efficiency compared to PCIe 4.0 at 608MBps per watt, a 30 percent increase that the chipmaker believes will lower server operating costs as well as a data center's carbon footprint.
The chipmaker plans to mass produce the PM1743 in the first quarter of 2022, though it has already begun doling out samples for joint system development.
The storage device will be available in several capacities ranging from 1.92TB to 15.36TB. The chipmaker also claims the PM1743 will be the industry's first PCIe 5.0 SSD to provide dual-port support, keeping data accessible in the event of a single-port failure.
The PM1743 also promises to address data security issues that continue to permeate the enterprise server market.
“By embedding a security processor and root of trust (RoT), the SSD will protect against security threats and data forgery to provide data confidentiality and integrity, while also enabling Secure Boot in server systems through attestation,” Samsung said in its press release.
Samsung, however, isn’t the only company to dabble in PCIe 5.0. Adata, a fabless Taiwanese memory and storage manufacturer, plans to showcase a few prototype PCIe 5.0 SSDs at CES 2022.
These include Adata’s Project Nighthawk and Project Blackbird SSDs, which use a PCIe 5.0 x4 interface paired with the NVMe 2.0 protocol.
Kioxia, formerly Toshiba Memory, also announced a few months back a prototype of its CD7 Series, which is designed with the PCIe 5.0 interface.
The post Windows 11 Compatibility Check: How to Know If Your Laptop or PC is Eligible for Upgrade appeared first on PC Tech Magazine.
Microsoft officially announced the OS at a virtual event hosted by Chief Product Officer Panos Panay, but did not mention when the new OS would be available to consumers. There's no official public release date for Windows 11 yet, though all signs point to October 2021. If the OS is in fact released in October, retailers will likely start selling computers with Windows 11 preinstalled.
Microsoft has promised to make the OS available as a free upgrade in early 2022 to all existing Windows 10 users.
Although Microsoft will be offering Windows 11 as a free upgrade for devices already running Windows 10, this does not mean that your computer's hardware configuration will be compatible. In addition to requiring a trusted platform module (TPM) chip, the device will also need one of the supported processors.
As part of the new minimum system requirements, Windows 11 will be supported only on 64-bit (x64) processors and only in specific chips from Intel, AMD, and Qualcomm, leaving a lot of older computers without the possibility to upgrade. For example, from Intel, the new version will officially support only the 8th Gen and newer processors and some of the Celeron, Atom, Pentium, and Xeon chips. And from AMD, Windows 11 will support third-generation Ryzen and newer processors, including some second-generation Ryzen 7 CPUs and some Athlon and EPYC processors.
To check whether your processor is supported for the upgrade, there are several quick ways on Windows 10 to confirm if it’s on the list of supported hardware using the Settings app, Command Prompt, or the PC Health Check app.
How to check CPU compatibility using Settings:
How to check CPU compatibility using commands:
wmic computersystem get systemtype
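On Windows 10 that command prints the system type, e.g. `x64-based PC` on a 64-bit machine. As a rough cross-platform analogue, purely illustrative and not part of Microsoft's own tooling, Python's standard `platform` module reports the same architecture information:

```python
import platform

# Machine strings that indicate a 64-bit CPU, one of Windows 11's
# hard requirements. AMD64/ARM64 are Windows names; x86_64/aarch64 are Unix names.
X64_MACHINES = {"AMD64", "x86_64", "ARM64", "aarch64"}

def is_64bit(machine: str) -> bool:
    """True if the reported machine string indicates a 64-bit CPU."""
    return machine in X64_MACHINES

arch = platform.machine()
print(arch, "->", "64-bit" if is_64bit(arch) else "32-bit")
```

A 32-bit result rules the machine out immediately; a 64-bit result still needs the processor to appear on Microsoft's supported-CPU lists.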
Alternatively, you can also use the Windows PC Health Check app to test whether your PC meets the system requirements for Windows 11, but unfortunately, the app was temporarily removed because many people reported that the app lacked sufficient information. Microsoft said in a blog post that it plans to address the feedback and get it back online sometime before Windows 11 becomes generally available in the fall.
The processor is one of the most important requirements for a computer or laptop to be eligible for the Windows 11 upgrade, which is why we looked at it first. However, to run Windows 11, your PC or laptop must also meet the requirements below:
| Component | Minimum |
|---|---|
| Processor | A compatible 64-bit processor (x86-64 or ARM64) with at least 1 GHz clock rate and at least 2 cores |
| Memory (RAM) | At least 4 GB |
| Storage space | At least 64 GB |
| System firmware | UEFI |
| Security | Secure Boot (enabled by default) and Trusted Platform Module (TPM) version 2.0 |
| Graphics card | Compatible with DirectX 12 or later with a WDDM 2.0 driver |
| Display | High-definition (720p) display greater than 9" diagonally, 8 bits per color channel |
| Internet connection and Microsoft account | Internet connection and Microsoft account required to complete first-time setup on Windows 11 Home |
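The minimums in the table translate directly into a checklist. A sketch that encodes them; the field names and example specs are mine, for illustration only, and membership on Microsoft's supported-CPU lists still has to be checked separately:

```python
from dataclasses import dataclass

@dataclass
class PCSpec:
    """A machine's specs, mirroring the minimum-requirements table."""
    cores: int
    clock_ghz: float
    ram_gb: float
    storage_gb: float
    uefi_secure_boot: bool
    tpm_version: float

def meets_windows11_minimums(pc: PCSpec) -> bool:
    """True if the machine clears every hardware minimum in the table."""
    return (
        pc.cores >= 2
        and pc.clock_ghz >= 1.0
        and pc.ram_gb >= 4
        and pc.storage_gb >= 64
        and pc.uefi_secure_boot
        and pc.tpm_version >= 2.0
    )

# An otherwise capable machine fails on TPM 1.2 alone
old_pc = PCSpec(cores=4, clock_ghz=3.2, ram_gb=8, storage_gb=256,
                uefi_secure_boot=True, tpm_version=1.2)
print(meets_windows11_minimums(old_pc))  # False
```

This mirrors how many machines were excluded in practice: plenty of CPU and RAM, but an older TPM or firmware configuration.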
| Feature | Requirements |
|---|---|
| 5G support | 5G-capable modem |
| Auto HDR | HDR-capable monitor |
| Biometric authentication and Windows Hello | Illuminated infrared camera or fingerprint reader |
| BitLocker to Go | USB flash drive (available in Windows 11 Pro and higher editions) |
| Hyper-V | Second Level Address Translation (SLAT) |
| DirectStorage | NVMe solid-state drive and a DirectX 12 graphics card with Shader Model 6.0 |
| DirectX 12 Ultimate | Available with supported games and graphics cards |
| Spatial sound | Supporting hardware and software |
| Two-factor authentication | Use of PIN, biometric authentication, or a phone with Wi-Fi or Bluetooth capabilities |
| Speech recognition | Microphone |
| Wi-Fi 6E support | New WLAN IHV hardware and driver, Wi-Fi 6E-capable AP/router |
| Windows Projection | Wi-Fi adapter that supports Wi-Fi Direct, WDDM 2.0 |
Microsoft says Windows 10 will be retired in 2025 to make room for Windows 11. This comes six years after Microsoft last overhauled its operating system with Windows 10, a major update that's now running on around 1.3 billion devices worldwide, according to CCS Insight.