Slow performance on Linux (Ubuntu Dapper)
sethjackson:
Are you using the proprietary nVidia driver???
I have OpenBSD + X.Org 6.9 + GeForce2 MX/MX 400, and it works fine. :)
I haven't compiled C::B on it yet because the wx version is at 2.4.2 for OpenBSD...
sque:
Everything else works fine on my PC... at least that's how it looks.
--- Quote from: sethjackson on August 29, 2006, 06:57:05 pm ---Are you using the proprietary nVidia driver???
--- End quote ---
Yes, I am running the proprietary driver, version 8762.
Part of my /var/log/Xorg.0.log:
--- Code: ---sque@ubuntu:~$ cat /var/log/Xorg.0.log | grep NVIDIA
(**) | |-->Device "NVIDIA Corporation NV36 [GeForce FX 5700]"
(II) Module glx: vendor="NVIDIA Corporation"
(II) Module nvidia: vendor="NVIDIA Corporation"
(II) NVIDIA X Driver 1.0-8762 Mon May 15 13:09:21 PDT 2006
(II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
(--) Chipset NVIDIA GPU found
(**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
(==) NVIDIA(0): RGB weight 888
(==) NVIDIA(0): Default visual is TrueColor
(==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
(**) NVIDIA(0): Option "RenderAccel" "true"
(**) NVIDIA(0): Enabling RENDER acceleration
(II) NVIDIA(0): NVIDIA GPU GeForce FX 5700 at PCI:1:0:0
(--) NVIDIA(0): VideoRAM: 131072 kBytes
(--) NVIDIA(0): VideoBIOS: 04.36.20.19.06
(II) NVIDIA(0): Detected AGP rate: 8X
(--) NVIDIA(0): Interlaced video modes are supported on this GPU
(--) NVIDIA(0): Connected display device(s) on GeForce FX 5700 at PCI:1:0:0:
(--) NVIDIA(0): Samsung SyncMaster (CRT-1)
(--) NVIDIA(0): Samsung SyncMaster (CRT-1): 400.0 MHz maximum pixel clock
(II) NVIDIA(0): Assigned Display Device: CRT-1
(II) NVIDIA(0): Validated modes:
(II) NVIDIA(0): "1280x1024"
(II) NVIDIA(0): "1024x768"
(II) NVIDIA(0): "800x600"
(II) NVIDIA(0): "640x480"
(II) NVIDIA(0): Virtual screen size determined to be 1280 x 1024
(--) NVIDIA(0): DPI set to (95, 96); computed from "UseEdidDpi" X config option
(II) NVIDIA(0): Setting mode "1280x1024"
(II) NVIDIA(0): NVIDIA 3D Acceleration Architecture Initialized
(II) NVIDIA(0): Using the NVIDIA 2D acceleration architecture
(==) NVIDIA(0): Backing store disabled
(==) NVIDIA(0): Silken mouse enabled
(**) NVIDIA(0): DPMS enabled
(II) XINPUT: Adding extended input device "NVIDIA Event Handler" (type: Other)
--- End code ---
glxgears runs at about 5000 fps. I can run Xgl with no speed problems and everything works perfectly... I can watch video...
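For anyone checking whether hardware acceleration is really active before blaming the driver, a minimal sketch (assuming glxinfo and glxgears are installed, e.g. from the mesa-utils package):
--- Code: ---# Confirm that direct rendering (hardware acceleration) is in use
glxinfo | grep "direct rendering"

# Confirm the NVIDIA GLX implementation is the one loaded
glxinfo | grep "OpenGL vendor"

# Rough throughput check; prints a frames-per-second figure every few seconds
glxgears
--- End code ---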
Game_Ender:
I am not at my Linux machine right now, but let's not just assume it's the graphics driver. I use ATI and I have the same problem. I think the best thing would be to do as sque has already started: compare sysprof profiles of the same activity. For example, scrolling through a very long project file (everyone would use the same one from the C::B project), or just typing continuously for 1 to 2 minutes. Then we can see where the people with the slowdown spend their time.
I found the biggest chunk of time was spent in pango, invoked by scintilla. sque, I suggest you give sysprof a try; it's not too hard to figure out how it works and it would probably be a good learning experience. Sysprof shows you a tree view of each running process and how much of its time was spent in each branch (each branch being a function, with its sub-branches being the functions it calls). You have to find the one which is codeblocks and go from there.
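If anyone wants to capture a comparable profile, here is a minimal sketch for Ubuntu; the package name is an assumption and may differ (older sysprof releases also need a matching kernel module):
--- Code: ---# Install the profiler (package name is an assumption; adjust as needed)
sudo apt-get install sysprof

# Start the GUI, press Start, then switch to Code::Blocks and scroll through
# a long source file (or type continuously) for 1 to 2 minutes
sysprof

# Press Stop, locate the codeblocks process in the tree view, and save the
# profile so results can be compared here
--- End code ---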
sethjackson:
Has anyone tried SciTE on a *nix box, and compared it to C::B on the same *nix box?
I may try to compile Scintilla and SciTE soon to see if it is slow....
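For reference, a minimal sketch of building SciTE on GTK for such a comparison, assuming the Scintilla and SciTE source archives from scintilla.org are unpacked side by side (paths are illustrative):
--- Code: ---# Scintilla has to be built first; SciTE's makefile expects it in ../scintilla
cd scintilla/gtk
make

# Then build and install SciTE itself
cd ../../scite/gtk
make
sudo make install

# Open the same large source file in SciTE and in C::B and compare
# scrolling and typing responsiveness (the installed binary name may vary)
SciTE
--- End code ---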
takeshimiya:
--- Quote from: sethjackson on August 30, 2006, 03:29:37 am ---Has anyone tried SciTE on a *nix box, and compared it to C::B on the same *nix box?
I may try to compile Scintilla and SciTE soon to see if it is slow....
--- End quote ---
And let's not forget wyoEditor too (since it may be a wxScintilla issue).
http://wyoguide.sourceforge.net/downloads.html