[SOLVED] How do you input monitor details in xorg.conf.d/ ?

I've recently installed Arch on an HP box - quad core i3 @ 3.1GHz, 4GB RAM, Radeon HD 6450 GPU / 1GB RAM;- it is using a Samsung SyncMaster 2443 24" monitor.
[edit:] Using the open source xf86-video-ati driver stack. /edit
The problem I have is that the X server thinks that the monitor isn't capable of handling more than VESA modes - in this case 1024x768 resolution.
If I change xorg.conf.d/20-gpudriver.conf from vesa to ati, or create a .../20-radeondriver.conf with radeon in it, the X server fails to start, stating that the screen found was unsuitable.
So, how do I get around this? This monitor was previously being used on an AGP nVidia GeForce 7950GT very happily at 1920x1200.
I've searched & not come up with info on this. The data on the X.org wiki is out of date in this regard.
Thanks for your time.
Last edited by handy (2011-11-12 01:53:03)
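For reference, monitor details are normally entered by hand as a Monitor section in a file under /etc/X11/xorg.conf.d/. The snippet below is only a sketch: the file name, the "Samsung2443" identifier, the output name, and the Modeline (generated with `cvt 1920 1200 60`) are illustrative assumptions, not details from handy's actual machine, and it only takes effect with a native driver such as radeon, not vesa:

```
# /etc/X11/xorg.conf.d/10-monitor.conf   (hypothetical file name)
Section "Monitor"
    Identifier "Samsung2443"
    # Modeline produced by `cvt 1920 1200 60`; check it against the panel's specs.
    Modeline "1920x1200_60.00"  193.25  1920 2056 2256 2592  1200 1203 1209 1245 -hsync +vsync
    Option   "PreferredMode" "1920x1200_60.00"
EndSection

Section "Device"
    Identifier "Radeon HD 6450"
    Driver     "radeon"
    Option     "Monitor-DVI-0" "Samsung2443"   # output name is a guess; confirm with xrandr
EndSection
```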

I just ran the following command:
sudo hwinfo --framebuffer | less
{I didn't really need to pipe the output to less, but if you don't use the --framebuffer option you certainly do (well I did anyway).}
Which gives me this output:
02: None 00.0: 11001 VESA Framebuffer
[Created at bios.459]
Unique ID: rdCR.aX_EXjgoS_0
Hardware Class: framebuffer
Model: "(C) 1988-2010, AMD CAICOS"
Vendor: "(C) 1988-2010, AMD Technologies Inc."
Device: "CAICOS"
SubVendor: "AMD ATOMBIOS"
SubDevice:
Revision: "01.00"
Memory Size: 16 MB
Memory Range: 0xc0000000-0xc0ffffff (rw)
Mode 0x0300: 640x400 (+640), 8 bits
Mode 0x0301: 640x480 (+640), 8 bits
Mode 0x0303: 800x600 (+832), 8 bits
Mode 0x0305: 1024x768 (+1024), 8 bits
Mode 0x0307: 1280x1024 (+1280), 8 bits
Mode 0x0310: 640x480 (+1280), 15 bits
Mode 0x0311: 640x480 (+1280), 16 bits
Mode 0x0313: 800x600 (+1600), 15 bits
Mode 0x0314: 800x600 (+1600), 16 bits
Mode 0x0316: 1024x768 (+2048), 15 bits
Mode 0x0317: 1024x768 (+2048), 16 bits
Mode 0x0319: 1280x1024 (+2560), 15 bits
Mode 0x031a: 1280x1024 (+2560), 16 bits
Mode 0x030d: 320x200 (+640), 15 bits
Mode 0x030e: 320x200 (+640), 16 bits
Mode 0x0320: 320x200 (+1280), 24 bits
Mode 0x0393: 320x240 (+320), 8 bits
Mode 0x0395: 320x240 (+640), 16 bits
Mode 0x0396: 320x240 (+1280), 24 bits
Mode 0x03b3: 512x384 (+512), 8 bits
Mode 0x03b5: 512x384 (+1024), 16 bits
Mode 0x03b6: 512x384 (+2048), 24 bits
Mode 0x03c3: 640x350 (+640), 8 bits
Mode 0x03c5: 640x350 (+1280), 16 bits
Mode 0x03c6: 640x350 (+2560), 24 bits
Mode 0x0333: 720x400 (+768), 8 bits
Mode 0x0335: 720x400 (+1472), 16 bits
Mode 0x0336: 720x400 (+2944), 24 bits
Mode 0x0353: 1152x864 (+1152), 8 bits
Mode 0x0355: 1152x864 (+2304), 16 bits
Mode 0x0356: 1152x864 (+4608), 24 bits
Mode 0x0363: 1280x960 (+1280), 8 bits
Mode 0x0365: 1280x960 (+2560), 16 bits
Mode 0x0366: 1280x960 (+5120), 24 bits
Mode 0x0321: 640x480 (+2560), 24 bits
Mode 0x0322: 800x600 (+3200), 24 bits
Mode 0x0323: 1024x768 (+4096), 24 bits
Mode 0x0324: 1280x1024 (+5120), 24 bits
Mode 0x0343: 1400x1050 (+1408), 8 bits
Mode 0x0345: 1400x1050 (+2816), 16 bits
Mode 0x0346: 1400x1050 (+5632), 24 bits
Mode 0x0373: 1600x1200 (+1600), 8 bits
Mode 0x0375: 1600x1200 (+3200), 16 bits
Mode 0x0376: 1600x1200 (+6400), 24 bits
Mode 0x0383: 1792x1344 (+1792), 8 bits
Mode 0x0385: 1792x1344 (+3584), 16 bits
Mode 0x0386: 1792x1344 (+7168), 24 bits
Mode 0x03d3: 1856x1392 (+1856), 8 bits
Mode 0x03d5: 1856x1392 (+3712), 16 bits
Mode 0x03d6: 1856x1392 (+7424), 24 bits
Mode 0x03e3: 1920x1440 (+1920), 8 bits
Mode 0x03e5: 1920x1440 (+3840), 16 bits
Mode 0x03e6: 1920x1440 (+7680), 24 bits
So now at last I understand (I had wondered about this) that my graphics card's VESA BIOS supplies only the above resolutions. This is why I see all of them being tested in the Xorg.0.log & I don't see 1920x1200.
I can at last put this one to bed.
Last edited by handy (2011-11-12 21:58:52)

Similar Messages

  • How do you input an equation in a mathematical format?

    How do you input an equation in a mathematical format like the equation above? Thanks!

    var radicand:Number = b*b - 4*a*c;
    if (radicand >= 0) {
        var rad:Number = Math.sqrt(radicand);
        var x1:Number = (-b + rad)/(2*a);
        var x2:Number = (-b - rad)/(2*a);
    } else {
        rad = Math.sqrt(-radicand);
        var s1:String = (-b/(2*a)) + "+i" + (rad/(2*a));
        var s2:String = (-b/(2*a)) + "-i" + (rad/(2*a));
    }

  • How do you capture real detail of sql errors in crystal viewer of CR2008

    How do you capture the further details of an SQL error when running a report in the Crystal 2008 viewer (within a software application)? I get a 'Failed to retrieve data from database. Details: Database Vendor Code: -nnnn' error, where nnnn is a number like 210074. If I run the same .rpt directly through Crystal I get a 'failed to retrieve rowset' message. Then after choosing OK on that message I get another message with more details, and this time (in this particular case) it happens to be 'Column XXXXX cannot be found in database or is not specified for query'. Other times it could be something about exceeding max length/precision, etc.
    We recently switched from using the RDC CR10 to this new .net viewer way. With the RDC it gave us the 'meaningful' message so we could tell what was wrong. Using this new method it does not. Is there a setting for showing more detail, or some coding that would give me the real message detail using CR2008?
    Don't know if it matters but in this case it is a Progress database.

    Hello,
    Back in the RDC days we manually added a lot of code to capture the exception info. When CR was re-written in 9 we removed all custom "fixes" and error handling; we basically said any ANSI SQL-92 standard rules apply. We just pass on what the client gives us.
    What this means is we dynamically pass the exception from the client to CR and interpret what we can, or we simply pass the error code to the error UI.
    What you can do is use a try/catch block and create your own error table, keyed on the Progress error codes, to pass a more meaningful message on to your users.
    We won't, due to the vast number of possible errors and the vast number of DB clients we would have to maintain. If the client doesn't pass meaningful info on to CR then we just pass its error on to the end user.
    Thank you
    Don
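Don's suggestion - wrap the report call in a try/catch and translate vendor codes through your own error table - might look like the sketch below. It is language-agnostic (shown here in Python), and the code numbers and message texts are illustrative placeholders, not real Progress error codes:

```python
# Hypothetical lookup table mapping database vendor codes to messages fit
# for end users. The code/text pairs below are placeholders only.
PROGRESS_ERRORS = {
    210074: "A column named in the report is missing from the database query.",
}

def friendly_message(vendor_code):
    """Return a user-facing message for a raw database vendor code."""
    default = "Database error (vendor code {0}). Please contact support.".format(vendor_code)
    return PROGRESS_ERRORS.get(vendor_code, default)
```

In the real application the catch block would extract the vendor code from the Crystal exception, call a function like this, and show the result instead of the raw error.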

  • How do you input notes?

    I'm trying to click the notes in to compose a song, and nothing's going in! How do you do it? I don't want to use the keyboard piano note thingy!
    Message was edited by: Host

    politeness often gets quick replies

  • How do you convert channels.dvb to channels.conf

    Title says is all. How do you convert the kaffeine file channels.dvb to a channels.conf file which can be used by mplayer.
    The winterhill transmitter in the UK is now totally digital, since the change over I've not been able to get BBC channels properly in mplayer. The receiver is a Freecom DVB-T USB. I've manually edited the uk-WinterHill file with the transmitter info to update it and used it to create a channels.conf file.
    scan /usr/share/dvb/dvb-t/uk-WinterHill > /home/~/channels.conf
    This picked up pretty much all the channels & I can tune into BBC channels but receive no picture. I've googled a bit and it seems other people have this problem too; they think it's due to the increased power of the signal after the changeover to digital. I've tried a variable attenuator, which did no good.
    I've now found that kaffeine can scan for and find the BBC channels AND it can also play them back properly. So the theory about the signal strength being too much for the Freecom stick's attenuator is probably a load of guff, at least in my case.
    The data for BBC One from the kaffeine channels.dvb is:-
    TV|BBC ONE|101(2)|102(eng),106(eng),|0|4168|4168|Terrestrial|802000|0|v|-1|-1|-1|-1|8|-1|-1|-1|45|105(16)(1)(1)(eng),||9018|-1|0|
    The same data from the mplayer/scan channels.conf is:-
    BBC ONE:802000000:INVERSION_AUTO:BANDWIDTH_8_MHZ:FEC_2_3:FEC_AUTO:QAM_64:TRANSMISSION_MODE_8K:GUARD_INTERVAL_1_32:HIERARCHY_NONE:101:102:4168
    I really don't want to keep kaffeine installed, and any links or info that would help convert these channel data files would be greatly appreciated. There seem to be ways to convert them the other way round, but so far I've found nothing to do it from channels.dvb to channels.conf.
    Last edited by Nixie (2009-12-17 08:42:56)
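Lacking a ready-made tool, a small script can get partway there. The sketch below is built only from the two sample lines above: the '|'-field positions in the kaffeine format are inferred from that single BBC ONE entry, and the DVB-T tuning parameters (bandwidth, FEC, modulation, and so on) are copied verbatim from the scan output rather than decoded from channels.dvb - so treat it as a starting point, not a verified converter:

```python
# Convert one kaffeine channels.dvb line into an mplayer channels.conf line.
# Field positions are guessed from a single sample entry; the tuning string
# is hard-coded from the matching scan output, not derived from the input.
TUNING = ("INVERSION_AUTO:BANDWIDTH_8_MHZ:FEC_2_3:FEC_AUTO:QAM_64:"
          "TRANSMISSION_MODE_8K:GUARD_INTERVAL_1_32:HIERARCHY_NONE")

def dvb_to_conf(line):
    f = line.split("|")
    name = f[1]                   # "BBC ONE"
    vpid = f[2].split("(")[0]     # "101(2)" -> "101"
    apid = f[3].split("(")[0]     # "102(eng),106(eng)," -> "102"
    sid = f[5]                    # service ID, "4168"
    freq_hz = int(f[8]) * 1000    # kaffeine stores the frequency in kHz
    return "%s:%d:%s:%s:%s:%s" % (name, freq_hz, TUNING, vpid, apid, sid)
```

Run over every TV line in channels.dvb, this would rebuild a channels.conf that at least matches the sample pair shown above.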

    Do you want to capture a single frame of the video as a separate image? You can play the video full screen in QuickTime, pause it at the appropriate frame and take a screenshot using shift-command-3.
    Matt

  • [solved] [xorgserver 1.6.1] and the minimalist xorg.conf

    Hello new xorgserver users
    Following the recent xorg-server upgrade to 1.6.1 I get a black screen each time I start X. The only way to solve that problem (I'm driving an nvidia desktop) was to replace the nvidia driver with xf86-video-nv and stop using xorg.conf altogether (otherwise X crashes).
    But I miss ctrl+alt+backspace
    So I tried to use a minimal xorg.conf with only three lines:
    Section "ServerFlags"
    Option "DontZap" "false"
    EndSection
    X crashes on start, my Xorg.0.log :
    X.Org X Server 1.6.1
    Release Date: 2009-4-14
    X Protocol Version 11, Revision 0
    Build Operating System: Linux 2.6.29-ARCH i686
    Current Operating System: Linux archibald 2.6.29-ARCH #1 SMP PREEMPT Wed Apr 8 12:47:56 UTC 2009 i686
    Build Date: 15 April 2009 11:09:10AM
    Before reporting problems, check http://wiki.x.org
    to make sure that you have the latest version.
    Markers: (--) probed, (**) from config file, (==) default setting,
    (++) from command line, (!!) notice, (II) informational,
    (WW) warning, (EE) error, (NI) not implemented, (??) unknown.
    (==) Log file: "/var/log/Xorg.0.log", Time: Sun Apr 19 15:27:19 2009
    (==) Using config file: "/etc/X11/xorg.conf"
    Parse error on line 2 of section ServerFlags in file /etc/X11/xorg.conf
    Unexpected EOF. Missing EndSection keyword?
    (EE) Problem parsing the config file
    (EE) Error parsing the config file
    Fatal server error:
    no screens found
    Please consult the The X.Org Foundation support
    at http://wiki.x.org
    for help.
    Please also check the log file at "/var/log/Xorg.0.log" for additional information.
    (WW) xf86CloseConsole: KDSETMODE failed: Bad file descriptor
    (WW) xf86CloseConsole: VT_GETMODE failed: Bad file descriptor
    Some Archers use this minimal xorg.conf. Why can't I?
    Last edited by frenchy (2009-04-19 15:51:35)

    Hi wonder
    The wiki says "ServerFlags"...
    Anyway, the ServerLayout section gives me this (Xorg.0.log):
    Parse error on line 2 of section ServerLayout in file /etc/X11/xorg.conf
    Unexpected EOF. Missing EndSection keyword?
    (EE) Problem parsing the config file
    (EE) Error parsing the config file

  • [SOLVED] How to turn off monitor at CLI

    Hi
        I've been trying to find a way to turn off my monitor, but the solutions I've found were almost all for the X environment.
        The best hint I've got was add following lines to /etc/rc.local:
    setterm -blank 1
    setterm -powersave on
    setterm -powerdown 1
    With that, the monitor was supposed to blank the screen after 1 minute and, after another minute, power down. But it only blanks and nothing more. Without the first line, "setterm -blank 1", it just doesn't do anything.
        Does anybody have an idea of what else I should try?
        Thanks in advance.
    Last edited by vinicius (2009-02-27 02:48:43)

    Hi, Nezmer
    Even with aliases, I would have to type von with monitor turned off.
    I've written a script, which is not a perfect solution, but works fine.
    monitor_off:
    #!/bin/bash
    # Check if X is running or not, turn off monitor, #
    # wait for a key press and turn it on again.      #
    grep_result_file=$PWD'/x_running'
    # Check if X is running.
    ps -e | grep -e "\bX\b" > $grep_result_file
    ps -e | grep -e "\bxorg\b" >> $grep_result_file
    ps -e | grep -e "\bxserver\b" >> $grep_result_file
    ## If you want to check the result file, uncomment the following lines.
    #echo "===== $grep_result_file - begin ====="
    #cat $grep_result_file
    #echo "===== $grep_result_file - end ====="
    if [ ! -s $grep_result_file ] || [[ $(tty) =~ tty ]] || [[ $(tty) =~ vc ]]; then
        echo 'Detected X not running, or you are at a console...'
        if [ $UID -ne 0 ]; then
            echo 'You need super user privileges to run this script at a console.'
            echo 'Rerun as super user, or start X and run from a terminal.'
            exit 0
        fi
        turn_off='vbetool dpms off'
        turn_on='vbetool dpms on'
    else
        echo 'Detected X running...'
        turn_off='xset dpms force off'
        turn_on='xset dpms force on'
    fi
    echo 'Turning off monitor...'
    $turn_off
    echo 'Waiting for a key press...'
    read -n1 -s
    echo 'Turning on monitor...'
    $turn_on
    rm $grep_result_file
    echo 'Finished: monitor_off'
    It checks if X is running or not because at X environment vbetool was a little slow, at least for me.
    That's it!
    I hope this could be useful for someone else.
    Thank you for helping, guys.
    Vinícius
    P.S.: I'm setting this thread as solved, but any suggestion about the script is welcome.

  • [SOLVED]How do you mark post as being solved

    I am trying to follow the link
    https://wiki.archlinux.org/index.php/Forum_Etiquette
    But I am unable to mark a post that I made as SOLVED.(aka the title)
    Any input on this is appreciated.
    Regards,
    -Narahari
    Last edited by savithari (2013-03-15 13:37:56)

    It's done manually. You edit the first post, which allows you to change the thread title.
    wiki wrote:Finally, when a solution is found, mark your thread as solved by editing the first post and prepending the tag [SOLVED] to the title in the "Subject" field.
    Last edited by alphaniner (2013-03-15 13:29:53)

  • How do you CONCURRENTLY monitor while recording a voice track?

    I want to use Garageband for recording hypnosis sessions. I connect a Plantronics USB Mic headset to the computer and record into that (and can hear through it too), but want ALSO for a client to be able to hear through a SEPARATE set of headphones - plugged into the headphone jack (or somehow).
    The problem is that the headphone jack of the Mac is disabled by the use of the other USB headphone being used for the recording/monitoring.
    How can I get this to work? People MUST be doing this.
    Thank you!
    Martin
    Intel iMac 20"   Mac OS X (10.4.8)  

    You can't do it the way you want to, GB supports a single output device only.
    You'd have to use this Headphone Splitter plugged into the headphone jack on the Mac.

  • [SOLVED] how do you get ASCII in a terminal?

    http://www.youtube.com/watch?v=iJpSQonl … annel_page
    Time index: 1:34
    That guy has a moose above his hostname. I thought the only way to add anything was by manipulating PS1, but that moose is above his hostname?
    Anyone know how to add something similar, maybe Arch's logo?
    Last edited by greenfish (2009-05-17 16:16:09)

    Ok, figured it out. Each of the "[..." actually needs to be prefixed with a (non-printable) ESCAPE character, ASCII code #27.
    Since they're non-printable they cannot be displayed in this forum. :-p
    I just wrote a little tool that writes a code-27 char in front of every "[" and that works.
    //File 'issue-make.c'
    //Create a usable 'issue' file (place it at /etc/issue) from various weirdly formatted text files.
    //It accepts [... or [[... or ^[... escape sequences.
    //By Jindur for bbs.archlinux.org. (Use/modify/hack/distribute/as you wish, just don't sell it.. as if lol).
    //Maybe you don't want single [ to be recognized as ESC sequence. Delete the code lines then i guess.
    #include <stdio.h>
    int main(int argc, char *argv[])
    {
        char buf[256], buf2[256];
        int n, n2;
        FILE *f, *f2;
        int just_converted = 0;
        if (argc < 2) {
            printf("This tool converts escape sequences ^[.. [[.. and [.. into real escape sequences.\n");
            printf("Enter filename of an ascii art file you wish to convert into a usable 'issue' file.\n");
            printf("This will overwrite an existing 'issue' file in the current directory.\n");
            printf("Example: issue-make my-cool-art.txt\n");
            return 1;
        }
        f = fopen(argv[1], "r");
        if (!f) {
            printf("Error: Could not open input file for reading.\n");
            return 2;
        }
        f2 = fopen("issue", "w");
        if (!f2) {
            printf("Error: Could not open 'issue' file for writing.\n");
            return 3;
        }
        while (fgets(buf, 256, f)) {
            n = -1;
            n2 = -1;
            do {
                n++;
                n2++;
                //case 1: sequence ^[...
                if (buf[n] == '^' && buf[n + 1] == '[') {
                    n++;
                    buf2[n2] = 27;
                    just_converted = -1;
                }
                //case 2/3: sequence [ or [[
                else if (buf[n] == '[' && !just_converted) {
                    if (buf[n + 1] == '[') {
                        buf2[n2] = 27;
                        just_converted = -1;
                    } else {
                        buf2[n2] = 27;
                        n2++;
                        buf2[n2] = '[';
                        just_converted = -1;
                    }
                } else {
                    buf2[n2] = buf[n];
                    just_converted = 0;
                }
            } while (buf[n]);
            fprintf(f2, "%s", buf2);
        }
        fclose(f2);
        fclose(f);
        return 0;
    }
    Use 'gcc' to compile, e.g.: gcc -o issue-make issue-make.c
    Last edited by Jindur (2013-06-10 17:01:44)

  • [Solved] How to retrieve AUR login details...

    Hi.
    It's been a while since I've logged into the AUR, and now I can't remember my user-name login.
    I was able to reset my password through my e-mail address, but I haven't a clue what my login name was.
    How do I retrieve this info?
    Last edited by *nixer (2012-01-05 03:12:49)

    Ask on the mailing list, they should be able to tell you what login belongs to your e-mail address.
    http://mailman.archlinux.org/pipermail/ … 17278.html

  • How do you input an ip address on a deskjet 3050a j611 series. it was working fine

    I have a Deskjet 3050A all-in-one series J611. It has been working great. I had printed some documents earlier with no problem. Now it is saying it cannot connect, due to not being able to connect to the wireless router.

    The first round of troubleshooting would probably be to reset the whole network. 
    1.  pull the power from the back of the printer and from the wall
    2.  shut down computer
    3.  reset the router by unplugging it for 30 sec
    4.  wait until all the lights on the router are back on
    5.  turn on the computer and let it get connected to the router
    6.  reconnect the printer--it should power up on its own
    This routine usually fixes a lot of issues.  Let me know if it works or doesn't.
    SandyBob
    If I have solved your issue, please feel free to provide kudos and make sure you mark this thread as solution provided!
    Although I work for HP, my posts and replies are my own opinion and not those of HP
    1-800-474-6836

  • How do you input notes onto a musical score in Garageband?

    Hello, after a failed attempt at doing it myself, I wondered if anybody else knew how to input notes onto a musical score in Garageband. When I say input notes I mean, for example, if the stave is there at the bottom, is there a way I can simply add a note (à la Sibelius), whether it be via a keyboard shortcut (cmd + n in Sibelius, though obviously in Garageband this creates a new project)? Any help would be appreciated, thanks.

    http://www.bulletsandbones.com/GB/GBFAQ.html#addmidinotes
    (Let the page FULLY load. The link to your answer is at the top of your screen)

  • How do you input guitar in GarageBand on iMac

    I've tried to input my guitar into GarageBand on my iMac, but have not been very successful. How can I do this? I have iRig, and plug it into the back port, but that doesn't work. I also have a USB Guitar Link that works with Amplitube 3 on the same iMac, but it doesn't work with GarageBand. I've tried changing the input choices under preferences, but no luck. With the USB Guitar Link, the Audio Preferences menu gives me the choice of "USB Audio CODEC" and I choose that... but that doesn't work either. The ONLY thing that has worked is the "Built-In Microphone"... but that records everything in the room, and it's of very low quality. Any help is appreciated.

    tlcrogers wrote:
    I plug my guitar into the audio output (headphones) jack. I then go into system settings and change the jack to be an input instead of an output.
    the iMac has separate input and output jacks, it does not work like your laptop's jack

  • How do you edit payment details?

    Ironically, technology firms often have the worst in-house technology. All I want to do is edit my payment options on-line, so I can renew. However, the 'system' will not do this. I cannot see an email address anywhere to ask this. I phoned the Maidenhead office to see if I could get through to customer service, and 'option 2 mode' on the phone is not working, so no luck there. I'm glad the product I buy seems to (mostly) work!
    Please help me with this simple query or give me the contact details of someone who can.
    Thanks!

    Well, Reset Prefs is also a shotgun approach and while it might fix the problem, you end up having to restore workspaces and custom settings like Layer thumbnails sizes and workspaces.
    Reset will clear a particular dialog's settings and unlike Prefs clears just that problem dialog.
    You can also Reset a particular Tool if it gives you problems. Just right-click on its icon in the Options bar and you will see the Reset menu.
    Get familiar with Preferences > Sync Settings. On my PC side, I upload everything. If I have to Reset Prefs, I go into Sync Settings, download and it's very close to what it was before with my custom presets.
    Here's a cool bonus: my Windows 7 Dell has enough vram to use 3D, which is what you need if Filter > Render > Tree/Flame/Frame is grayed out; with 3D available, those renders activate.
    My Mid 2009 Macbook however does not have enough vram for 3D and by itself cannot activate the renders. However if I go into Sync Settings and download what I uploaded from the Dell side, it actually activates those renders.
    Anyway, awful glad I could help you get it fixed. Have a great evening
    Gene
