Ati X1600 Linux Driver
- Linux Driver Download
- Ati X1600 Linux Drivers
- Ati X1600 Linux Driver Updater
- Ati X1600 Linux Driver Installer
Remove any previous ATI Radeon X1600 driver from the Windows Control Panel and reboot the system. Windows then detects the ATI Radeon X1600 card and searches for a driver: you must stop any such request by hitting the 'Cancel' button. Double-click Setup.exe (or a similar xxx.exe file) located in the c: Radeon HD X1600 folder and follow the on-screen instructions. AMD graphics cards are well supported on Ubuntu 20.04 Focal Fossa: the default open source AMD Radeon driver is installed and enabled out of the box. However, since Ubuntu 20.04 is a long term support (LTS) release, AMD Radeon graphics card users have a few AMD Radeon driver installation options at their disposal. Downloads: 21 drivers for ATI Radeon X1600 graphics. Here's where you can download the newest software for your Radeon X1600.
After a week of tearing my hair out and reading threads on this forum and others, I was able to get Mint 18 working on a late 2006 iMac. Everything minus the iSight camera works, but that's not a problem. What IS a problem is heat generation. I know that it is commonly known that Mint/Ubuntu 'run hot' on Macs for whatever reason, and usually people fight this by using some sort of fan control. While this may remedy the heat and eventually wear out fans that weren't designed to run all of the time, it doesn't address the actual issue. I also considered going back to 17, but this heat issue has been around since 13, from what I've read. Perhaps even further.
I seem to have narrowed the problem (at least on my machine) down to the graphics. The system is a Core 2 Duo 2.16. Viewing the System Monitor 'Resources' tab, both cores are at 10-15% usage at idle. Moving the mouse increases both cores by 5%. Dragging a window across the screen brings both cores to 65%. Scrolling a window in Firefox brings both cores to 100% momentarily. The System Monitor 'Processes' tab hardly shows any CPU usage at all, with System Monitor itself in the lead at 5%.
Symptoms - after about 1-2 hours of use, the video gets 'flaky': some fonts look weird, there are random lines and dots on the screen, etc. Reaching behind the iMac where the vent is, the heat feels sufficient to cook an egg.
Graphics - ATI Radeon X1600 Mobility with 512MB VRAM. The iMacs are internally a laptop configuration. Of course the logic board is made specifically for the iMac, but the 'flavor' of the setup is basically a laptop, and most Linux distros treat it as such. Is there a different driver that I can try? The one that is installed by default works wonderfully, but something about this particular configuration doesn't seem 100% compatible with this graphics chipset. Is there somewhere else I can check for CPU usage? How can I try and narrow this down? Perhaps I'm using the regular X1600 drivers and not the Mobility drivers? Would there be a difference? How can I check?
I also realize that Gnome may just be too much for this old machine to handle, but I think a Core 2 Duo with 3GB RAM should be able to operate within this environment pretty easily. I'm not trying to play games or process videos and graphics; I just want to browse the web and do email. Any suggestions would be appreciated.
This article covers the radeon open source driver which supports the majority of AMD (previously ATI) GPUs.
Selecting the right driver
Depending on the card you have, find the right driver in Xorg#AMD. This page has instructions for ATI.
If unsure, try this open source driver first; it will suit most needs and is generally less problematic. See the feature matrix to know what is supported and the decoder ring to translate marketing names (e.g. Radeon HD4330) to chip names (e.g. R700).
Installation
Install the mesa package, which provides the DRI driver for 3D acceleration.
- For 32-bit application support, also install the lib32-mesa package from the multilib repository.
- For the DDX driver (which provides 2D acceleration in Xorg), install the xf86-video-ati package.
Support for accelerated video decoding is provided by mesa-vdpau and lib32-mesa-vdpau packages.
Loading
The radeon kernel module should load fine automatically on system boot.
If it does not happen, then:
- Make sure you do not have nomodeset or vga= as a kernel parameter, since radeon requires KMS.
- Also, check that you have not disabled radeon by using any kernel module blacklisting.
Enable early KMS
See Kernel mode setting#Early KMS start.
Xorg configuration
Xorg will automatically load the driver and it will use your monitor's EDID to set the native resolution. Configuration is only required for tuning the driver.
If you want manual configuration, create /etc/X11/xorg.conf.d/20-radeon.conf, and add the following:
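A minimal version of this section might look like the following (the Identifier string is arbitrary):

```
Section "Device"
    Identifier "Radeon"
    Driver "radeon"
EndSection
```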
Using this section, you can enable features and tweak the driver settings.
Performance tuning
Enabling video acceleration
See Hardware video acceleration.
Graphical tools
- WattmanGTK — A GTK3 user interface written in Python 3, which allows you to view and monitor Radeon performance, fan speeds and power states, and to overclock the graphics processor. It uses the AMDGPU kernel driver and requires setting a kernel parameter (amdgpu.ppfeaturemask) in order to enable the AMD Overdrive technology within GNU/Linux, which is necessary to use WattmanGTK.
- radeon-profile — Qt application for displaying info about a Radeon card.
- https://github.com/marazmista/radeon-profile || radeon-profile-gitAUR
Driver options
The following options apply to /etc/X11/xorg.conf.d/20-radeon.conf.
Please read radeon(4) and RadeonFeature first before applying driver options.
Acceleration architecture; Glamor is available as a 2D acceleration method implemented through OpenGL, and it is the default for R600 (Radeon HD2000 series) and newer graphic cards. Older cards use EXA.
DRI3 is enabled by default since xf86-video-ati 7.8.0. For older drivers, which use DRI2 by default, switch to DRI3 with the following option:
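For instance, in the Device section of 20-radeon.conf (option name as documented in radeon(4)):

```
Option "DRI" "3"
```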
TearFree is a tearing prevention option which prevents tearing by using the hardware page flipping mechanism:
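For example, in the Device section (the "on" value is per radeon(4)):

```
Option "TearFree" "on"
```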
ColorTiling and ColorTiling2D are supposed to be enabled by default. Tiled mode can provide significant performance benefits with 3D applications. It is disabled if the DRM module is too old or if the current display configuration does not support it. KMS ColorTiling2D is only supported on R600 (Radeon HD2000 series) and newer chips:
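To set them explicitly in the Device section (option names per radeon(4)):

```
Option "ColorTiling" "on"
Option "ColorTiling2D" "on"
```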
When using Glamor as acceleration architecture, it is possible to enable the ShadowPrimary option, which enables a so-called 'shadow primary' buffer for fast CPU access to pixel data, and separate scanout buffers for each display controller (CRTC). This may improve performance for some 2D workloads, potentially at the expense of other (e.g. 3D, video) workloads. Note that enabling this option currently disables Option 'EnablePageFlip':
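For example, in the Device section:

```
Option "ShadowPrimary" "on"
```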
EXAVSync is only available when using EXA and can be enabled to avoid tearing by stalling the engine until the display controller has passed the destination region. It reduces tearing at the cost of performance and has been known to cause instability on some chips:
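For example, in the Device section:

```
Option "EXAVSync" "yes"
```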
Below is a sample configuration file of /etc/X11/xorg.conf.d/20-radeon.conf:
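A sample along these lines, assuming Glamor on an R600-or-newer card (tune the options to your hardware):

```
Section "Device"
    Identifier  "Radeon"
    Driver      "radeon"
    Option      "AccelMethod"   "glamor"
    Option      "DRI"           "3"
    Option      "TearFree"      "on"
EndSection
```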
Kernel parameters
You can check the available module parameters with systool as stated in Kernel modules#Obtaining information. Defining the gartsize, if not autodetected, can be done by adding radeon.gartsize=32 as a kernel parameter.
The changes take effect at the next reboot.
Deactivating PCIe 2.0
Since kernel 3.6, PCI Express 2.0 in radeon is turned on by default.
It may be unstable with some motherboards and can be deactivated by adding radeon.pcie_gen2=0 as a kernel parameter.
See Phoronix article for more information.
Gallium Heads-Up Display
The radeon driver supports the activation of a heads-up display (HUD) which can draw transparent graphs and text on top of applications that are rendering, such as games. These can show values such as the current frame rate or the CPU load for each CPU core or an average of all of them. The HUD is controlled by the GALLIUM_HUD environment variable, and can be passed the following list of parameters among others:
- 'fps' - displays current frames per second
- 'cpu' - displays the average CPU load
- 'cpu0' - displays the CPU load for the first CPU core
- 'cpu0+cpu1' - displays the CPU load for the first two CPU cores
- 'draw-calls' - displays how many times each material in an object is drawn to the screen
- 'requested-VRAM' - displays how much VRAM is being used on the GPU
- 'pixels-rendered' - displays how many pixels are being displayed
To see a full list of parameters, as well as some notes on operating GALLIUM_HUD, you can also pass the 'help' parameter to a simple application such as glxgears and see the corresponding terminal output:
More information can be found from this mailing list post or this blog post.
Hybrid graphics/AMD Dynamic Switchable Graphics
This is the technology used on recent laptops equipped with two GPUs: one power-efficient (generally an Intel integrated card) and one more powerful and more power-hungry (generally Radeon or Nvidia). There are two ways to get it working:
- If it is not required to run 'GPU-hungry' applications, it is possible to disable the discrete card (see Ubuntu wiki): echo OFF > /sys/kernel/debug/vgaswitcheroo/switch
- PRIME: This is the proper way to use hybrid graphics on Linux, but it still requires a bit of manual intervention from the user.
Powersaving
With the radeon driver, power saving is disabled by default and has to be enabled manually if desired.
You can choose between three different methods:
- dpm (enabled by default since kernel 3.13)
- dynpm (dynamic frequency switching, see #Dynamic frequency switching)
- profile (profile-based frequency switching, see #Profile-based frequency switching)
See https://www.x.org/wiki/RadeonFeature/#index3h2 for more details.
Dynamic power management
Since kernel 3.13, DPM is enabled by default for lots of AMD Radeon hardware. If you want to disable it, add the parameter radeon.dpm=0 to the kernel parameters; on hardware where it is not the default, the radeon.dpm=1 kernel parameter will enable dpm.
Unlike dynpm, the 'dpm' method uses hardware on the GPU to dynamically change the clocks and voltage based on GPU load. It also enables clock and power gating.
There are 3 operation modes to choose from:
- battery: lowest power consumption
- balanced: sane default
- performance: highest performance
They can be changed via sysfs.
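For example, assuming your card is card0:

```
# echo battery > /sys/class/drm/card0/device/power_dpm_state
```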
For testing or debugging purposes, you can force the card to run in a set performance mode:
- auto: default; uses all levels in the power state
- low: enforces the lowest performance level
- high: enforces the highest performance level
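For example, assuming your card is card0:

```
# echo low > /sys/class/drm/card0/device/power_dpm_force_performance_level
```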
Command-line tools
- radcard - A script to get and set DPM power states and levels
Old methods
Dynamic frequency switching
This method dynamically changes the frequency depending on GPU load, so performance is ramped up when running GPU intensive apps, and ramped down when the GPU is idle. The re-clocking is attempted during vertical blanking periods, but due to the timing of the re-clocking functions, does not always complete in the blanking period, which can lead to flicker in the display. Due to this, dynpm only works when a single head is active.
It can be activated by simply running the following command:
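Presumably via the power_method sysfs file, e.g. (assuming your card is card0):

```
# echo dynpm > /sys/class/drm/card0/device/power_method
```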
Profile-based frequency switching
This method will allow you to select one of the five profiles (described below). Different profiles, for the most part, end up changing the frequency/voltage of the GPU. This method is not as aggressive, but is more stable and flicker free and works with multiple heads active.
To activate the method, run the following command:
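Presumably via the power_method sysfs file, e.g. (assuming your card is card0):

```
# echo profile > /sys/class/drm/card0/device/power_method
```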
Select one of the available profiles:
- default: uses the default clocks and does not change the power state. This is the default behaviour.
- auto: selects between the mid and high power states based on whether the system is on battery power or not.
- low: forces the gpu to be in the low power state all the time. Note that low can cause display problems on some laptops, which is why auto only uses low when monitors are off. It is selected by the other profiles when the monitors are in the DPMS-off state.
- mid: forces the gpu to be in the mid power state all the time.
- high: forces the gpu to be in the high power state all the time.
As an example, we will activate the low profile (replace low with any of the aforementioned profiles as necessary):
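Assuming the profile method is active and your card is card0:

```
# echo low > /sys/class/drm/card0/device/power_profile
```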
Persistent configuration
The methods described above are not persistent. To make them persistent, you may create a udev rule (example for #Profile-based frequency switching):
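A sketch of such a rule, e.g. in /etc/udev/rules.d/30-radeon-pm.rules (the file name and the card0 match are assumptions to adapt to your system):

```
KERNEL=="card0", SUBSYSTEM=="drm", DRIVERS=="radeon", ATTR{device/power_method}="profile", ATTR{device/power_profile}="low"
```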
As another example, dynamic power management can be permanently forced to a certain performance level:
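A sketch of such a rule, again assuming card0 and the low level as an example:

```
KERNEL=="card0", SUBSYSTEM=="drm", DRIVERS=="radeon", ATTR{device/power_dpm_force_performance_level}="low"
```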
To determine the KERNEL name, execute:
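For instance, listing the DRM devices (card0, card1, ...):

```
$ ls /sys/class/drm/
```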
Other notes
To view the speed that the GPU is running at, perform the following command and you will get something like this output:
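With the radeon driver, the current clocks are exposed via debugfs; as root (the DRI device number may vary):

```
# cat /sys/kernel/debug/dri/0/radeon_pm_info
```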
The output depends on which GPU line yours is, along with the radeon driver and kernel versions, so it may not show much (or any) voltage regulation at all.
Thermal sensors are implemented via external i2c chips or via the internal thermal sensor (rv6xx-evergreen only). To get the temperature on ASICs that use i2c chips, you need to load the appropriate hwmon driver for the sensor used on your board (lm63, lm64, etc.); the DRM will attempt to load the appropriate hwmon driver. On boards that use the internal thermal sensor, the DRM will set up the hwmon interface automatically. When the appropriate driver is loaded, the temperatures can be accessed via lm_sensors tools or via sysfs in /sys/class/hwmon.
Fan Speed
While the power saving features above should handle fan speeds quite well, some cards may still be too noisy in their idle state. In this case, and when your card supports it, you can change the fan speed manually.
Warning:
- Keep in mind that the following method sets the fan speed to a fixed value; hence, it will not adjust with the stress of the GPU, which can lead to overheating under heavy load.
- Check the GPU temperature when applying lower than standard values.
To control the GPU fan, see Fan speed control#AMDGPU sysfs fan control (amdgpu and radeon share the same controls for this).
For persistence, see the example in #Persistent configuration.
If a fixed value is not desired, there are possibilities to define a custom fan curve manually, for example by writing a script in which fan speeds are set depending on the current temperature (current value in /sys/class/drm/card0/device/hwmon/hwmon0/temp1_input).
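An illustrative sketch of such a script; the hwmon path and the curve points are assumptions to adjust for your card. It maps the current temperature to a PWM value, enables manual fan control via pwm1_enable, and writes the value to pwm1:

```shell
#!/bin/sh
# Hypothetical fan curve: map a temperature in degrees C to a PWM value (0-255).
pwm_for_temp() {
    t=$1
    if [ "$t" -lt 50 ]; then
        echo 80       # quiet below 50 C
    elif [ "$t" -lt 70 ]; then
        echo 150      # medium between 50 C and 70 C
    else
        echo 255      # full speed from 70 C upwards
    fi
}

# Assumed sysfs location; verify the hwmon number on your system.
HWMON=/sys/class/drm/card0/device/hwmon/hwmon0
if [ -r "$HWMON/temp1_input" ] && [ -w "$HWMON/pwm1_enable" ]; then
    temp=$(( $(cat "$HWMON/temp1_input") / 1000 ))  # millidegrees -> degrees C
    echo 1 > "$HWMON/pwm1_enable"                   # 1 = manual fan control
    pwm_for_temp "$temp" > "$HWMON/pwm1"
fi
```

Run it periodically (e.g. from a systemd timer or cron) so the fan speed keeps following the temperature.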
A GUI solution is available by installing radeon-profile-gitAUR.
TV out
First, check that you have an S-video output: xrandr should give you something like:
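e.g. a line such as the following (illustrative output; names vary by card):

```
S-video disconnected (normal left inverted right x axis y axis)
```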
Now we should tell Xorg that it is actually connected (it is, right?)
Setting TV standard to use:
Adding a mode for it (currently supports only 800x600):
Clone mode:
Now let us try to see what we have:
At this point you should see a 800x600 version of your desktop on your TV.
To disable the output, do
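The individual steps above could look roughly like this (the property names come from the radeon driver; check them with xrandr --prop first, and adjust output names to your setup):

```
xrandr --output S-video --set "load detection" 1   # report the output as connected
xrandr --output S-video --set "tv standard" ntsc   # set the TV standard to use
xrandr --addmode S-video 800x600                   # add the (currently only supported) mode
xrandr --output S-video --same-as VGA-0            # clone the main screen
xrandr --output S-video --mode 800x600             # enable the output
xrandr --output S-video --off                      # disable the output again
```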
Force TV-out in KMS
The kernel can recognize the video= parameter in the following form (see KMS for more details):
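Roughly, following the kernel's modedb format (a sketch, not the full grammar; a trailing e forces the output on, d forces it off):

```
video=<connector>:<xres>x<yres>[-<bpp>][@<refresh>][e|d]
```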
For example:
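A hypothetical example (connector name and mode are placeholders):

```
video=DVI-I-1:1024x768-24@60e
```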
Parameters with whitespaces must be quoted:
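For example, using the connector name quoted in the LILO note below:

```
video="9-pin DIN-1:1024x768-24@60e"
```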
Current mkinitcpio implementation also requires # in front. For example:
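i.e. the quoted parameter prefixed with #, mirroring the LILO example below:

```
# "video=9-pin DIN-1:1024x768-24@60e"
```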
- GRUB Legacy can pass such a command line as is.
- LILO needs backslashes for doublequotes (append # 'video=9-pin DIN-1:1024x768-24@60e').
You can get a list of your video outputs with the following command:
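For example, the connector directories under sysfs are named after the outputs (assuming card0):

```
$ ls -d /sys/class/drm/card0-*
```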
HDMI audio
HDMI audio is supported in the xf86-video-ati video driver. To disable HDMI audio, add radeon.audio=0 to your kernel parameters.
If there is no video after boot up, the driver option has to be disabled.
Note:
- If HDMI audio does not work after installing the driver, test your setup with the procedure at Advanced Linux Sound Architecture/Troubleshooting#HDMI Output does not work.
- If the sound is distorted in PulseAudio, try setting tsched=0 as described in PulseAudio/Troubleshooting#Glitches, skips or crackling, and make sure the rtkit daemon is running.
- Your sound card might use the same module, since HDA compliant hardware is pretty common. See Advanced Linux Sound Architecture#Set the default sound card using one of the suggested methods, which include using the defaults node in the alsa configuration.
Multihead setup
Using the RandR extension
See Multihead#RandR how to setup multiple monitors by using RandR.
Independent X screens
Independent dual-headed setups can be configured the usual way. However, you might want to know that the radeon driver has a 'ZaphodHeads' option which allows you to bind a specific device section to an output of your choice:
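A sketch of such a device section (identifier and output name are examples to adapt):

```
Section "Device"
    Identifier  "Device0"
    Driver      "radeon"
    Option      "ZaphodHeads"   "VGA-0"
EndSection
```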
This can be a life-saver when using video cards that have more than two outputs. For instance, with one HDMI out, one DVI and one VGA, the driver will only select and use the HDMI+DVI outputs for the dual-head setup, unless you explicitly specify 'ZaphodHeads' 'VGA-0'.
Turn vsync off
The radeon driver will probably enable vsync by default, which is perfectly fine except for benchmarking. To turn it off, try the vblank_mode=0 environment variable or create ~/.drirc (edit it if it already exists) and add the following:
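A ~/.drirc along these lines (the driver attribute is an assumption and may need to match your setup):

```
<driconf>
    <device screen="0" driver="dri2">
        <application name="Default">
            <option name="vblank_mode" value="0" />
        </application>
    </device>
</driconf>
```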
If vsync is still enabled, you can disable it by editing /etc/X11/xorg.conf.d/20-radeon.conf. See #Driver options.
Troubleshooting
Performance and/or artifacts issues when using EXA
If having 2D performance issues, like slow scrolling in a terminal or web browser, adding Option 'MigrationHeuristic' 'greedy' as a device option may solve the issue.
In addition disabling EXAPixmaps may solve artifacts issues, although this is generally not recommended and may cause other issues.
Adding undetected/unsupported resolutions
See Xrandr#Adding undetected resolutions.
TV showing a black border around the screen
When connecting a TV using the HDMI port, the TV may show a blurry picture with a 2-3cm border around it. This protects against overscanning (see Wikipedia:Overscan), but can be turned off using xrandr:
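For example, assuming the TV is attached to HDMI-0 (the underscan property comes from the radeon driver; verify with xrandr --prop):

```
xrandr --output HDMI-0 --set underscan off
```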
Black screen and no console, but X works in KMS
This is a solution to the no-console problem that might come up, when using two or more ATI cards on the same PC. Fujitsu Siemens Amilo PA 3553 laptop for example has this problem. This is due to fbcon console driver mapping itself to the wrong framebuffer device that exists on the wrong card. This can be fixed by adding this to the kernel boot line:
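Based on the explanation that follows, the parameter maps the console to the second framebuffer:

```
fbcon=map:1
```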
This will tell fbcon to map itself to the /dev/fb1 framebuffer device and not /dev/fb0, which in our case exists on the wrong graphics card. If that does not fix your problem, try booting with fbcon=map:0 instead.
ATI X1600 (RV530 series) 3D applications show black windows
There are three possible solutions:
- Try adding pci=nomsi to your boot loader Kernel parameters.
- If this does not work, you can try adding noapic instead of pci=nomsi.
- If none of the above work, then you can try running vblank_mode=0 glxgears or vblank_mode=1 glxgears to see which one works for you, then install driconfAUR and set that option in ~/.drirc.
Cursor corruption after coming out of sleep
If the cursor becomes corrupted, e.g. repeating itself vertically after the monitor(s) come out of sleep, set 'SWCursor' 'True' in the 'Device' section of the /etc/X11/xorg.conf.d/20-radeon.conf configuration file.
DisplayPort stays black on multimonitor mode
Try booting with the kernel parameter radeon.audio=0.
R9-390 Poor Performance and/or Instability
Firmware issues with R9-390 series cards include poor performance and crashes (frequently triggered by gaming or using Google Maps), possibly related to DPM. Comment 115 of this bug report includes instructions for a fix.
QHD / UHD / 4k support over HDMI for older Radeon cards
Older cards have their pixel clock limited to 165MHz for HDMI. Hence, they support QHD or 4k only via dual-link DVI, but not over HDMI.
One possibility to work around this is to use custom modes with lower refresh rate, e.g. 30Hz.
Another one is a kernel patch removing the pixel clock limit, but this may damage the card!
Official kernel bug ticket with patch for 4.8: https://bugzilla.kernel.org/show_bug.cgi?id=172421
The patch introduces a new kernel parameter radeon.hdmimhz which alters the pixel clock limit.
Be sure to use a high speed HDMI cable for this.
Horizontal flickering occasionally when using 4k DP output on 390X
If you use a 390X (or perhaps a similar model) with 4k output over DP, you may experience occasional horizontal artifacts/flickering: roughly every half hour, a horizontal strip of pixels (with a height of ~100 pixels, running across the whole screen's width) shakes up and down for a few seconds. This might be a bug in the radeon driver; changing to AMDGPU seems to fix it.
See also
Benchmark showing the open source driver is on par performance-wise with the proprietary driver for many cards.