Splitting large sql database files
Hi
I have large SAP SQL Server data files (about 40 GB in 3 data files). Is there any way to split those data files into smaller ones?
Thanks and Regards
Jijeesh
Hi Jijeesh,
There is a way of splitting the files without hampering performance; I have done this on a production system with no loss of performance.
All data files in MS SQL Server belong to a filegroup. Suppose there are 4 data files in your database, each of 4 GB, and all of them belong to the same filegroup:
file1.mdf
file2.mdf
file3.mdf
file4.mdf
and you want to split each of them into 2 GB data files. Here is what you do.
(Before starting, take a complete offline backup of your database: stop SAP and perform a full offline backup of all data and log files.
Technically this operation could be performed with SAP up and running; however, it is safer to keep the database idle during a reorganization, so stop the SAP instance and then do the following.)
1. Select the file file1.mdf.
2. Add 2 new files to the same filegroup as file1.mdf, named file1a.mdf and file1b.mdf, of 2 GB each.
3. Restrict growth on all the remaining files (file2.mdf, file3.mdf and file4.mdf) by unchecking the Autogrow option.
4. Open Query Analyzer and run the following:
USE [<SAP SID>];
GO
DBCC SHRINKFILE ('file1', EMPTYFILE);
GO
(Note: DBCC SHRINKFILE takes the logical file name, not the physical name file1.mdf.)
The above commands will empty the contents of file1.mdf and redistribute the data to the other files in its filegroup.
Because we turned off autogrow on file2.mdf, file3.mdf and file4.mdf, the data can only be distributed into the new files we created, file1a.mdf and file1b.mdf.
When the command has completed, you can safely remove file1.mdf.
5. Perform steps 1-4 for the remaining files: file2.mdf, file3.mdf and file4.mdf.
After the above operation, run DBCC CHECKDB; this will check the integrity of your database and confirm everything is okay.
Then run update statistics to ensure that performance will not be hampered.
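Put together, the procedure above can be sketched in T-SQL roughly as follows. This is a sketch only: the database name, logical file names, paths and sizes are placeholders, and DBCC SHRINKFILE / REMOVE FILE work on logical file names.

```sql
USE [SID];
GO

-- 2. Add the two new 2 GB files to the same filegroup as file1
ALTER DATABASE [SID] ADD FILE
    (NAME = 'file1a', FILENAME = 'D:\Data\file1a.mdf', SIZE = 2GB),
    (NAME = 'file1b', FILENAME = 'D:\Data\file1b.mdf', SIZE = 2GB)
TO FILEGROUP [PRIMARY];
GO

-- 3. Restrict growth on the remaining files so the data has nowhere else to go
ALTER DATABASE [SID] MODIFY FILE (NAME = 'file2', FILEGROWTH = 0);
ALTER DATABASE [SID] MODIFY FILE (NAME = 'file3', FILEGROWTH = 0);
ALTER DATABASE [SID] MODIFY FILE (NAME = 'file4', FILEGROWTH = 0);
GO

-- 4. Move all data out of file1 into the other files of its filegroup
DBCC SHRINKFILE ('file1', EMPTYFILE);
GO

-- Once empty, drop the file from the database
ALTER DATABASE [SID] REMOVE FILE file1;
GO

-- Afterwards: verify integrity and refresh statistics
DBCC CHECKDB ('SID');
EXEC sp_updatestats;
```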
Regards,
Siddhesh
Similar Messages
-
How to attach/restore the sql database/file extension (*.File)
I took a backup of the database long ago. Now I am unable to attach/restore the SQL database file, which has the extension (*.File). Can anyone tell me what steps to take to do this?
Thanks in advance.
Database name --> Right Click --> Tasks --> Restore --> Database --> select the "Device" radio button --> click the device path button --> click the "Add" button --> I tried with "All Files (*)" --> then I selected my database --> OK --> OK --> OK -->
Then I am getting the error below (I am unable to post an image):
"Restore of database 'Databasename' failed. (microsoft.sqlserver.management.relationalenginetasks)"
Additional Information:
System.Data.SqlClient.SqlError: The database was backed up on a server running version 8.00.0194. That version is incompatible with this server, which is running version 11.00.2100. Either
restore the database on a server that supports the backup, or use a backup that is compatible with this server. (Microsoft.SqlServer.SmoExtended).
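The root cause here is that a backup taken on SQL Server 2000 (version 8.00) cannot be restored directly on SQL Server 2012 (version 11.00); it would first have to be restored on an intermediate version such as SQL Server 2005 or 2008 and backed up again from there. To confirm which version a backup file was taken on, a sketch like this can help (the path is a placeholder):

```sql
-- SoftwareVersionMajor / SoftwareVersionMinor in the output identify
-- the SQL Server version that produced the backup
RESTORE HEADERONLY FROM DISK = N'C:\backup\mydb.bak';
```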
Can you please help me solve this error?
Thanks,
Laxmi. -
Hi All,
Please give some information and your valuable suggestions on the best procedure to split MS SQL Server 2000 database files.
By default the database is in autogrow mode and it keeps growing; we would now like to split it into a few data files and distribute the load across different spindles in RAID arrays to get maximum I/O.
What is the ideal/best practice for splitting: a SQL script (Query Analyzer) or third-party tools that split database files into multiple files?
How can we confirm the consistency/integrity of the DB after splitting?
What kind of query should we run for splitting and for mapping the DB files after moving them to different locations on the SAN?
Hope you can give some valuable information on this.
by
Harika
The easiest way to do this is to create a homogeneous system copy using R3load; that means you unload your system, delete it, and reinstall it from the export you created.
At installation time you can enter the number of files and their sizes.
A common best practice is to use as many data files as you have physical disks.
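Whichever approach is chosen, it helps to first inspect the current file layout. A quick sketch, run in Query Analyzer (the database name is a placeholder):

```sql
USE mydb;
GO
-- Lists every data and log file with its logical name,
-- physical path, current size and growth settings
EXEC sp_helpfile;
```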
Markus -
Basic question on SQL database file structure
I'm learning to use JDBC / databases. I'm using JBuilder 3 Professional and HSQL (aka Hypersonic), and I've been following the Java Tutorial on databases.
So far I've successfully:
set up the drivers / URL etc
connected to the URL
created a database (COFFEEBREAK , as per the tutorial)
added tables
added data
queried the database
------------- with the tutorial , so far so good. --------
My only database experience prior to this was using Microsoft Access, where tables, data, queries etc. are all stored in one file, <myfile.mdb>.
But with the example I am following, not one but 4 files have been created:
COFFEEBREAK.data
COFFEEBREAK.backup
COFFEEBREAK.properties
COFFEEBREAK.script
The COFFEEBREAK.data file looks like this (one line) in a text editor:
padding="0" cell
After I added data using SQL, I was expecting this file to grow as data rows were added, but there was no change at all: still just one line.
The file that DOES change is the COFFEEBREAK.script file. The SQL instructions get added to the end of the file, e.g.:
<various lines>
INSERT INTO FILES VALUES('.\COFFEEBREAK.properties','COFFEEBREAK.properties')
INSERT INTO FILES VALUES('.\COFFEEBREAK.data','COFFEEBREAK.data')
INSERT INTO COFFEES VALUES('French_Roast',49,8.99,0,0)
INSERT INTO COFFEES VALUES('Espresso',150,9.99,0,0)
INSERT INTO COFFEES VALUES('Colombian_Decaf',101,8.99,0,0)
INSERT INTO COFFEES VALUES('French_Roast_Decaf',49,9.99,0,0)
I had been expecting a single file (a la Microsoft Access), with the data in it and not readable in ASCII form.
Am I misunderstanding the structure of SQL / DBMS files, or is this just a peculiarity of Hypersonic?
How do I ship a finished Java application where a database can be interrogated? Will the *.script file have to be shipped too? If so, all the data is there, so how is the DBMS password-protected?
I'm confused! Thanks in advance
M.
Thanks again for the response, trejkaz.
> If the database were encrypted, you could just use an ordinary SQL query tool to dump all the records to a text file, which would be just as readable as the .script file!
I had in mind that if the user name was "Bill" with his (self-selected) password "hisPass", then the encryption key would be derived from a mangled version of those two bits of data (e.g. qazBillwsxhisPass or whatever). As long as the mangling mechanism was secret, hopefully that would stop access through regular SQL query tools.
> You're giving them the software to read the database anyway... what's the difference if they read the info from the flat file, compared to reading it through a program?
Through the program the data is view-only (maybe with a limited subset of the fields for printing). With a flat file, a potential competitor gets a jump start by having an electronic copy to bootleg.
Two examples occurred to me: one simple, and one closer to what I'm trying to achieve.
1) A multiple-choice question database for students, where the computer tells them how they have done. Couldn't the students just look up the answers beforehand in the script file?
2) (Hypothetical example) Suppose I'm an expert on gardening and I want to set up 'Electronic Garden Advice' Inc. I prepare a database of, say, about 3000 records. A typical record may have a layout based on:
question
question ref. number
region
garden style ref.
plant type main
plant type sub
disease-prone?
commercial/non commercial
criticality weighting %
substitute plants
recommended treatments
preferred season
etc. Most of us can imagine more possible fields.
Anyone can pay the $, download the database and set themselves up as an expert, maybe extracting subsets of the data depending on their customers' needs.
But any printouts might only have, say, the question and question number, so the whole database couldn't be printed to a text file from my Java app. (The non-printed fields are used within the Java app, say for cross-referencing alternatives, etc.)
At the moment my options seem to be:
1) A different SQL database, where the data isn't stored in ASCII-readable form.
2) Just use a CSV text file instead of a DBMS and write my own encryption.
3) Encrypt the HSQL script file and have the Java app decrypt it to a temporary scratch file and load it that way. (The problem there is that in a multitasking OS, the user can just copy the scratch file.)
4) Somehow do the decryption after loading from an encrypted script file but before presenting the data on screen.
5) Any other ideas you can recommend?
Apologies for the long response; my thoughts on this are still evolving.
Thanks again
Mike2z -
How to locate SQL database files
Over the years I've used and discarded any number of applications, many of which created SQL databases. I'm trying to clean out various odds and ends and would like to know how to search for anything SQL. I know, for instance, that mail.app uses SQL but I have no idea where to find the database (not that I want to delete Mail!).
I'm comfortable using Terminal if that's the way to go.
Any tips on locating hidden SQL databases would really be appreciated.
Thanks!
Similar to when users delete mailbox content, moving user mailboxes doesn't reduce the size of the .edb file. Instead, the database simply gains "white space", which will be filled in before the .edb grows any further. Once your users are vacated, you can remove the databases from AD with the cmdlet mentioned above. As a safety measure, that cmdlet doesn't delete the actual .edb files, but once you've run it, yes, you just go and press the Delete key on the files. Obviously "Ctrl+A" means all files, which isn't what we're talking about: just the database and associated log files.
Mike Crowley | MVP
My Blog --
Baseline Technologies -
How to attach sql database file to creating .exe setup in c#
I created a setup file with a data folder, and in the data folder I put the database file. The problem is that I have to manually attach the database file in SQL Server 2005; I want the database file to be attached automatically when the setup is installed.
Hi
bhagvad,
Welcome to MSDN Forums!
We can implement auto-attach of the database just by using the following connection string.
1)
Add a folder named "DB" to the project, and copy the database file into it after you detach it from your SQL Server.
2)
Change your connection string like this:
<connectionStrings>
<add name="WindowsFormsApplication1.Properties.Settings.BabakConnectionString"
connectionString="Data Source=.;AttachDbFilename=|DataDirectory|\DB\Babak.mdf;Initial Catalog=test;Integrated Security=True"
providerName="System.Data.SqlClient" />
</connectionStrings>
3)
After this, the database file will be auto-attached to SQL Server when you run your application, and the database name will be "test"; you can open SQL Server Management Studio to find it.
In addition, you can find the connection string in the app.config file through Solution Explorer in VS2010. You can also find this file under your project folder; with this information, you can locate and modify it.
If there’s anything unclear, please feel free to let me know.
Best wishes,
Mike
Please remember to click "Mark as Answer" on the post that helps you, and to click "Unmark as Answer" if a marked post does not actually answer your question. This can be beneficial to other community members reading the thread.
When I try this I get an error. Only Microsoft SQL Server 2005 is installed on my computer, and I need to set up my project.
<add name="TelekomEthernetApp.Properties.Settings.TelekomEthernetConnectionString"
connectionString="Data Source=.;AttachDbFilename=|DataDirectory|\DB\TelekomEthernet.mdf;Initial Catalog=test;Integrated security=true" providerName="System.Data.SqlClient"/>
What is the problem?
http://img31.imageshack.us/img31/2305/asdet.png -
How To split large message using File Adapter
Hello everyone,
Here is my scenario FTP > XI > BI.
I got 2 questions:
(1) I am getting a large message, around a 70 MB file.
How can we split the message into multiple files of 10 MB each before processing the file in XI?
(2) Is there any way to find out the size of a file on the FTP server through XI, before processing the file and without contacting the FTP admin?
Thanks
Vick
Hi Vick,
Check these blogs:
Zip or Unzip your Payload with the new PayloadZipBean module of the XI Adapter Framework
Working with the PayloadZipBean module of the XI Adapter Framework
SAP XI acting as a (huge) file mover
Managing bulky flat messages with SAP XI (tunneling once again) - UPDATED
regards
kummari -
Splitting large imported video files
Hello,
I have a number of old videos that were recorded on an old VHS camera. Thanks to previous help from this forum I can now successfully import them into FCE4
Is it possible to import the whole 3-hour VHS tape into FCE4 and then split the large video file into smaller files using FCE4? Or do I need another program to do this? If so, please suggest a suitable one.
Thanks
John Hutchinson
Hello Tom,
Thanks for your reply.
Since there are many “episodes” on the three-hour VHS tapes, I don't really want to keep stopping the import after each “episode”; I'm looking for a quick way to import the video file. I was hoping I could import the full three hours in one go, then break the very large media file into the various “episodes” using either the scrubber bar in FCE4 or something similar in another program. Some of the material on the VHS tapes is not worth keeping, and I was hoping to delete the sections no longer required, thus saving valuable disc space.
Regards
John Hutchinson -
How to stop BDB from Mapping Database Files?
We have a problem where physical memory on Windows (NT kernel 6 and up, i.e. Windows 7, 2008 R2, etc.) gets maxed out after some time when running our application. On an 8 GB machine, our process loading BDB is only around 1 GB. But when looking at memory usage with RAMMap, you can see that the BDB database files (not the shared region files) are being mapped into memory, and that is where most of the memory consumption is taking place. I wouldn't care normally, as memory mapping can have performance and usability benefits, but the result is that the system comes to a screeching halt. This happens when we are inserting records at a high rate, e.g. tens of millions of records in a short time frame.
I would attach a picture to this post, but for some reason the insert image is greyed out.
Environment open flags: DB_CREATE | DB_INIT_LOCK | DB_INIT_LOG | DB_INIT_TXN | DB_INIT_MPOOL | DB_THREAD | DB_LOCKDOWN | DB_RECOVER
Database open flags: DB_CREATE | DB_AUTO_COMMIT
An update for the community:
Cause
We opened a support request (SR) to work with Oracle on the matter. The conclusion we came to was that the main reason for the memory consumption was the Windows System Cache. (For reference, see http://support.microsoft.com/kb/976618) When files are opened in buffered mode, the equivalent of calling CreateFile without specifying FILE_FLAG_NO_BUFFERING, all I/O to a file goes through the Windows System Cache. The larger the database file, the more memory is used to back it. This is not the same as memory-mapped files, which Berkeley DB uses for the region files (i.e. the environment). Those also use memory, but because they are bounded in size they will not cause an issue (e.g. if you need a bigger environment, just add more memory). The obvious reason to use the cache is performance optimization, particularly in read-heavy workloads.
The drawback, however, is that when there is a significant amount of I/O in a short amount of time, that cache can get really full and can result in physical memory being close to 100% used. This has negative effects on the entire system.
Time is important, because Windows needs time to transition active pages to standby pages, which decreases the amount of physical memory in use. What we found is that when our DB was installed on flash disk, we could generate a lot more I/O and our tests would run in a fraction of the time, but memory would get close to 100%. If we ran those same tests on slower disk, the result was the same, i.e. 10 million records inserted, but the run takes much longer and memory utilization does not come close to 100%. Note that we also see the memory consumption when we use the hot backup in the BDB library. The reason is obvious: in a short amount of time we are reading the entire BDB database file, which makes Windows use the system cache for it. The total amount of memory may be a factor as well: on a system with 16 GB of memory, even with flash disk, we had a hard time reproducing the issue where the memory climbs.
There is no Windows API that allows an application to control how much system cache is reserved or usable or maximum for an individual file. Therefore, BDB does not have fine grained control of this behavior on an individual file basis. BDB can only turn on or off buffering in total for a given file.
Workaround
In Berkeley DB, you can turn off buffered I/O on Windows by specifying the DB_DIRECT_DB flag to the environment. This is the equivalent of calling CreateFile with FILE_FLAG_NO_BUFFERING. All I/O goes straight to disk instead of memory, and all I/O must be aligned to a multiple of the underlying disk sector size. (The NTFS sector size is generally 512 or 4096 bytes, and normal BDB page sizes are generally multiples of that, so for most this shouldn't be a concern; but know that Berkeley DB will test the page size for compatibility and, if it is not compatible, silently disable DB_DIRECT_DB.) What we found in our testing is that the DB_DIRECT_DB flag had too great a negative effect on performance with anything but flash disk, and we therefore cannot use it. We may consider it acceptable for flash environments where we generate significant I/O in short time periods. We could not reproduce the memory effect when the database was hosted on a SAN disk running 15K SAS, which is more typical, and are therefore closing the SR.
However, Windows does have an API that controls the total system wide amount of system cache space to use and we may experiment with this setting. Please see this http://support.microsoft.com/kb/976618 We are also going to experiment with using multiple database partitions so that Berkeley spreads the load to those other files possibly giving the system cache time to move active pages to standby. -
When attempting to restore a backup file to a SQL database, the restore failed and the error message referenced that the backup was from a "different" database. I am trying to restore a damaged database file created in the same version of SQL 2005. Any suggestions that would allow the restore to work are appreciated.
> the error message referenced that the backup was from a "different" database.
When you try to restore to a different database, you have to use the option "Overwrite existing database (WITH REPLACE)", otherwise you get this error message; it's a kind of protection. See also:
http://social.msdn.microsoft.com/Forums/en-US/6eb5b1aa-511e-44d2-b677-b2ba3303c84c/datavabse-restore-error?forum=sqlsearch
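A minimal sketch of such a restore (the database name, logical file names and paths are placeholders):

```sql
RESTORE DATABASE [TargetDb]
FROM DISK = N'C:\backup\source.bak'
WITH REPLACE,  -- overwrite the existing, "different" database
     MOVE N'SourceDb_Data' TO N'D:\Data\TargetDb.mdf',
     MOVE N'SourceDb_Log'  TO N'D:\Log\TargetDb_log.ldf';
```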
Olaf Helper
[ Blog] [ Xing] [ MVP] -
Storing WORD or PDF files in SQL database
Can WORD document or PDF documents be stored in SQL database?
If so, how? I would like an example (if possible).
I do realize that I can store a reference pointer that points to a physical file on the server, but I don't want to do that.
You should do a little research on using BLOB data types.
The repository engine provides interfaces to handle properties that are binary large objects (BLOBs) and large text fields. BLOBs are properties whose values contain text or image data that can be in excess of 64 kilobytes (KB). You can use BLOBs to perform database operations that require you to work with large segments of data at a time.
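As a concrete illustration on SQL Server 2005 or later, a document can be stored in a varbinary(max) column. The table, column names and file path below are made-up examples:

```sql
CREATE TABLE dbo.Documents (
    DocId    int IDENTITY(1,1) PRIMARY KEY,
    FileName nvarchar(260)  NOT NULL,
    Content  varbinary(max) NOT NULL  -- the raw Word/PDF bytes
);

-- Read a file from disk into the table (the SQL Server service
-- account needs read access to the path)
INSERT INTO dbo.Documents (FileName, Content)
SELECT N'report.pdf', BulkColumn
FROM OPENROWSET(BULK N'C:\files\report.pdf', SINGLE_BLOB) AS doc;
```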
Phil -
Split a database file into multiple files on different drives.
Hi,
I have a large database file (.mdf) that is eating my disk space, so I have installed another disk. Now I have 2 drives (other than the OS drive): one holds the .mdf file and is almost full, and the new drive is empty.
I need to know how to split the .mdf into another file located on the new drive, while keeping the original .mdf file from growing any more.
I know how to split an Access file, but I have never done this with an MDF, so if you know the connection string you should be able to adapt this code. It will ask the user for the location of the file to be split and will split it in the same folder; you can then move the splits to another folder.
It will also let you choose how many records to split by. You can ignore the watch-folder part, or use it to automatically copy the split files and then add a delete from the original source.
Anyway here is the code and I hope it can help with MDF files.
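For an actual SQL Server .mdf, though, no record-level splitting is needed; the usual approach is plain T-SQL: add a second data file on the new drive and stop the original file from growing. A sketch (database name, logical names, paths and sizes are placeholders):

```sql
-- Add a new data file on the new drive
ALTER DATABASE [MyDb] ADD FILE (
    NAME = 'MyDb_Data2',
    FILENAME = 'E:\SQLData\MyDb_Data2.ndf',
    SIZE = 10GB,
    FILEGROWTH = 1GB
);
GO

-- Stop the original file from growing, so new data lands on E:
ALTER DATABASE [MyDb] MODIFY FILE (NAME = 'MyDb_Data', FILEGROWTH = 0);
GO

-- Optionally move existing pages into the new file
-- (the primary file cannot be emptied completely, but most pages move)
DBCC SHRINKFILE ('MyDb_Data', EMPTYFILE);
```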
Imports Microsoft.VisualBasic, System, System.Diagnostics, System.IO, System.Windows.Forms
Imports System.Data.OleDb, System.Data, System.Data.SqlClient, System.Data.Odbc, System.Threading
Imports Microsoft.Office.Interop, System.Runtime.InteropServices, Microsoft.Office.Interop.Outlook
Public Class Form1
Dim sConStr As String = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source="
Dim aConStr, CurPr, WFolder, VFolder, SFolder, ZFolder, eFile, tFile, FileName As String
Dim TableName, aConStr2, II, IV As String
Dim Recs, Maxd, i, StartRec, EndRec, TRec, iii As Integer
Dim Parts As Decimal
Private Sub Button1_Click(sender As Object, e As EventArgs) Handles Split_Recs.Click
WFolder = txtBxPath.Text + "\"
FileName = txtBxAccess.Text
TableName = txtBxTable.Text
aConStr = WFolder + FileName
Recs = txtBxTotRecs.Text
Maxd = txtBxMax.Text
Parts = Recs / Maxd
i = 0
StartRec = i
EndRec = Maxd
Do While i < Parts
Dim FirstRec As String = StartRec + 1
Dim LastRec As String = EndRec
Dim TotRec As String = EndRec
TRec = EndRec - StartRec
TotRec = TRec
II = i + 1
IO.File.Copy("\\pw_prod\Watch_Input\Glenn-PTZ\db1.mdb", WFolder + txtBxJobNum.Text + "Pt_" + II + ".mdb")
TextBox3.AppendText(" Between " + FirstRec + " and " + LastRec + vbCrLf)
txtBxProgress.AppendText("Part: " + II + " Start " + FirstRec + " End " + LastRec + " Total: " + TotRec + vbCrLf)
TextBox5.Text = " Between " + FirstRec + " and " + LastRec
StartRec = StartRec + Maxd
EndRec = EndRec + Maxd
'Creates Queries in Main File
Dim con As OleDbConnection = New OleDbConnection(sConStr + aConStr)
Dim cmd As OleDbCommand = New OleDbCommand("CREATE PROC Variable" + II + " AS SELECT * FROM " + TableName + " where " + txtBxSort.Text + TextBox5.Text + " Order by " + txtBxSort.Text, con)
con.Open()
cmd.ExecuteNonQuery()
con.Close()
'Export each new query as new access file
Dim DevComm1Conn_1 As New System.Data.OleDb.OleDbConnection(sConStr + WFolder + txtBxJobNum.Text + "Pt_" + II + ".mdb")
DevComm1Conn_1.Open()
Dim DevComm1 As New System.Data.OleDb.OleDbCommand( _
"SELECT * INTO [Variable" + II + "] FROM [MS Access;DATABASE=" + WFolder + FileName + ";].Variable" + II, DevComm1Conn_1)
DevComm1.ExecuteNonQuery()
DevComm1Conn_1.Close()
i = i + 1
IV = i - 1
If IV = 0 Then
IV = 1
End If
If EndRec > Recs Then
EndRec = Recs
End If
TextBox5.Text = String.Empty
If txtBxWatchCopy.Text = "Yes" Then
IO.File.Copy(WFolder + txtBxJobNum.Text + "Pt_" + II + ".mdb", txtBxWatch.Text + "\" + txtBxJobNum.Text + "Pt_" + II + ".mdb")
End If
Do While IO.File.Exists(txtBxWatch.Text + "\" + txtBxJobNum.Text + "Pt_" + II + ".mdb")
Threading.Thread.Sleep(50)
Loop
Loop
txtBxProgress.AppendText("Total Parts: " + II + vbCrLf)
MsgBox("Match all counts to ticket. If not match talk to CSR")
End Sub
End Class -
Hi,
I have a problem transferring XML file content to an MS SQL database through a given/fixed stored procedure. I'm able to transfer the content of the file using the following method ...
hstmt = DBPrepareSQL (hdbc, "EXEC usp_InsertReport '<Report> ..... </Report>'");
resCode = DBExecutePreparedSQL (hstmt);
resCode = DBClosePreparedSQL (hstmt);
... but in this case I'm not able to fetch the return value of the stored procedure!
I have tried to follow the stored procedure example in the help documentation (DBPrepareSQL), but I am missing a data type for XML.
Any idea how to solve my problem?
KR Cake
Solved!
Go to Solution.
After some additional trials I found a solution by calling the stored procedure in this way:
DBSetAttributeDefault (hdbc, ATTR_DB_COMMAND_TYPE, DB_COMMAND_STORED_PROC);
DBPrepareSQL (hdbc, "usp_InsertReport");
DBCreateParamInt (hstmt, "", DB_PARAM_RETURN_VALUE, -1);
DBCreateParamChar (hstmt, "XMLCONTENT", DB_PARAM_INPUT, sz_Buffer, (int) strlen(sz_Buffer) + 1 );
DBExecutePreparedSQL (hstmt);
DBClosePreparedSQL (hstmt);
DBGetParamInt (hstmt, 1, &s32_TestId);
where sz_Buffer is my XML file content and s32_TestId is the return value of the stored procedure (usp_InsertReport(@XMLCONTENT XML)).
Now I face the problem that DBCreateParamChar limits the buffer size to 8000 bytes.
Any idea how to bypass this limitation? -
I have Easy System Cleaner. It worked through Internet Explorer, but when it comes to Mozilla Firefox it stops.
The message is:
error executing sql. Error 26. File opened that is not a database file "select[sql] from sqlite_master where [type]="table' and lower (name)='mos_cookies' " File is encrypted or is not a database.
What should I do, and where do I look to solve this problem?
I need to use my Easy System Cleaner... I paid for it!
How do I fix this error?
-
Unable to generate SQL trace file on a 10.2.0.1.0 database
Hello,
I am unable to generate SQL trace files on a 10.2.0.1.0 database (the OS is Windows Server 2003 64-bit).
First I tried the way I used to do it on older databases (8i and 9i) :
execute dbms_system.set_sql_trace_in_session(sid, serial#, true);
I got no error, but no file was created in the user dump dest directory.
Then I've been searching and I've tried other things :
- I changed the user_dump_dest parameter
- I changed the "_trace_files_public" parameter value to "true" by modifying the init.ora file
- I tried another package :
exec dbms_monitor.session_trace_enable(139)
Nothing worked, I can't create any trace file...
Does anyone have an idea about this issue ?
thank you for you help !
Antoine
Hello,
thank you all for replying.
I have 2 instances on this machine, and I've just realized that with the other one I have no problem to generate sql trace files as usual.
But why the hell is it impossible on the first instance ? What difference between the 2 instances can explain this ?
This is pretty weird...
Otherwise I am experiencing serious performance problems on the instance where I can't create trace files; there must be something wrong with it, but I can't figure it out.
regards,
Antoine
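For what it's worth, on 10g you can double-check where each instance actually writes its trace files, and tag a session's trace file so it is easy to find. A sketch (the identifier string is arbitrary):

```sql
-- Where this instance writes user trace files
SELECT value FROM v$parameter WHERE name = 'user_dump_dest';

-- Tag and enable tracing for the current session
ALTER SESSION SET tracefile_identifier = 'mytrace';
ALTER SESSION SET sql_trace = TRUE;
```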