IDS database is too big
Hi all,
I see in Security monitor:
IPS database file: 119% (Usage)
idsmdc.db (28.94%)
idsmdc.log (90.45%)
Now I want to ask: if I delete the file idsmdc.log, will Cisco VMS still run correctly?
How can I decrease the size of the IDS database?
Thank you very much.
Hi, if you delete the idsmdc.log file, it will break the IPSMC and SecMon.
Please open a TAC case to resolve this issue.
Thank you.
Edward
Similar Messages
-
Alert log file of the database is too big
Dear Experts,
Let me update you that we are in the process of doing an R12.1 upgrade, and our current instance is running on 11.5.10 with database 9.2.0.6.
We have the below challenge before going for a database upgrade:
We have observed that the customer database alert_SID.log (9.2.0.6) has grown to 2.5GB. How can we purge this file? Please advise.
Please also note that our Instance is running on the Oracle Enterprise Linux 4 update 8
Regards
Mohammed
Rename the alert log file. Once you rename it, Oracle creates a new alert log file and continues writing to that; later, you can delete the old renamed file. It does not harm your database.
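As a concrete illustration of the renaming approach (the path here is a stand-in for the demo; on 9.2 the real alert log lives under background_dump_dest, typically $ORACLE_HOME/admin/&lt;SID&gt;/bdump):

```shell
# Sketch only: simulate the alert log with a stand-in file in the current directory.
ALERT_LOG="alert_SID.log"                # hypothetical name; use your real SID
echo "old ORA- messages" > "$ALERT_LOG"  # stands in for the 2.5GB log

# Rename the current log; Oracle simply opens a new alert log on its next write.
mv "$ALERT_LOG" "$ALERT_LOG.old"

# Optionally pre-create an empty log so monitoring scripts find the file immediately.
: > "$ALERT_LOG"

# The renamed copy can now be compressed or deleted at leisure.
ls -l "$ALERT_LOG" "$ALERT_LOG.old"
```

Some DBAs simply truncate in place (`: > alert_SID.log`) instead of renaming; either way the database itself is unaffected, since the alert log is write-only diagnostic output.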
--neeraj -
What are solutions for a way-too-big database?
Hi guys!
I'm a software developer and not very good at database design. One day, I was asked something like this:
"For example, there is a company with a web application. One day, the database for that application is way too big, causing performance issues and other problems. What is the solution for that application's database?"
At first, I thought that was about using multiple databases with a single app. But I don't know if I was right.
I want to ask: what are the solutions? If it's "multiple databases", then what should I do? Use two connections to two databases simultaneously?
I appreciate any replies. Thanks!
847617 wrote:
Thanks Lubiez Jean-Val... for your links.
I've got some more advices like :
- "transferring workload to another database using different techniques to copy the data from original db"
- "redesign of the database"
So that means we use 2 different databases?
Sometimes it is deemed desirable to keep only fairly recent data on the OLTP database, where the normal transaction activity happens, and replicate the data to another database that also contains historical data. This second database is used for heavy reporting tasks.
And "redesign"?
As in: design it from scratch and do it right this time. Make sure all data relations are properly defined to Third Normal Form; make sure all data is typed properly (use DATE columns for dates, NUMBER columns for numbers, etc.); make sure you have designed effective indexing; make sure you use the capabilities of the RDBMS and do NOT just use it as a data dump.
See http://www.amazon.com/Effective-Oracle-Design-Osborne-ORACLE/dp/0072230657/ref=sr_1_3?s=books&ie=UTF8&qid=1301257486&sr=1-3
Are they really good solutions?
Like most everything else, "it depends."
It depends on if the proposed solutions are implemented properly and address the root problem. The root problem (or even perceived problem) hasn't yet been defined. You've just assumed that at some undefined point the database becomes "way-too-big" and will cause some sort of problem.
It's assumed that we don't have or can't use partitioning.
And why is that assumed? Yes, you have to have a version of Oracle that supports it, and it is an extra-cost license. But like everything else, you and your management have to do a hard-nosed cost/benefit analysis. You may think you can't afford the cost of implementing partitioning, but it may be that you can't afford the expenses derived from NOT implementing it. I don't know what the case is for you, but you and your management should consider the factors instead of just rejecting it out of hand.
:):) ... You are making me, a student, so excited about the history. From slide rules to the moon....
-
Administration/Database too big size
Dear all,
Our customer has an administration database with a size of 4GB... without attachments, Word docs, Excel docs, etc.
Anybody know why the database is so big, and is there any way to shrink it?
SBO Version: SBO2005A PL22
SQLServer: 2005
Thanks in advance,
Chief
I have just checked the tables of the customer and the results are:
AITM: 12737
ADOC: 112112
ACRD: 11069
ADO1: 666508
AACP: 73
AACT: 277
AAD1: 46
AADM: 48
ACPR: 6502
ACR1: 20028
ACR2: 1959
ACRB: 50
ADO10: 2454
ADO6: 58911
ADP1: 33
AFPR: 84
AIT1: 49324
AITB: 36
AITT: 40
AITW: 6425
AJD1: 7908
AJDT: 3581
AKL1: 0
ALR1: 1987
ALR2: 6173
ALR3: 6173
ALT1: 3
AOB1: 3624
ARC2: 25673
ARC4: 5335
ARCT: 30204
ARI1: 18
ATC1: 2842
ATT1: 40
AUSR: 95
AWHS: 7
And how do you clean up the logs? Remove? -
Good morning.
I've just started exploring the possibility of making a photo album in iPhoto. When I place a photo on a full page, the photos are too big for the 'frame'- that is, the frame crops part of my photos off. I have tried using the resizing tool but find that the photos are already resized to their smallest.
Is there another way to deal with this problem?
Thanks,
BurntMonkey
John:
iPhoto puts up that warning when the resolution of the image falls below 180 dpi. A lot would depend on how far it falls below. They will not appear as good as they do on screen but that goes for all of the photos. The monitor display usually has more contrast and snap than the printed version. But that's not to mean the printed version doesn't or won't look good.
You can approximate the dpi of the photo for the frame that's giving the warning. The pages are 8.5 x 11. Estimate the size of the frame and divide the frame's dimension into the picture's corresponding pixel dimension.
What are the pixel dimensions of a typical photo that's giving you the warning sign, and what page layout/frame are they in?
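To make the arithmetic concrete, here is a minimal sketch of the estimate described above (the pixel and frame sizes are made-up example values):

```shell
# Estimate print resolution: dpi is the pixel dimension divided by the frame
# dimension in inches.
PIXEL_WIDTH=1600   # example: photo is 1600 pixels wide
FRAME_WIDTH=8      # example: frame is about 8 inches wide on the 8.5 x 11 page
echo "$((PIXEL_WIDTH / FRAME_WIDTH)) dpi"
```

At 200 dpi this example photo sits comfortably above iPhoto's 180 dpi warning threshold; a 1000-pixel-wide photo in the same frame (125 dpi) would trigger the warning.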
TIP: For insurance against the iPhoto database corruption that many users have experienced, I recommend making a backup copy of the Library6.iPhoto (iPhoto.Library for iPhoto 5 and earlier versions) database file and keeping it current. If problems crop up where iPhoto suddenly can't see any photos or thinks there are no photos in the library, replacing the working Library6.iPhoto file with the backup will often get the library back. By keeping it current I mean backup after each import and/or any serious editing or work on books, slideshows, calendars, cards, etc. That ensures that if a problem pops up and you do need to replace the database file, you'll retain all those efforts. It doesn't take long to make the backup and it's good insurance.
I've created an Automator workflow application (requires Tiger or later), iPhoto dB File Backup, that will copy the selected Library6.iPhoto file from your iPhoto Library folder to the Pictures folder, replacing any previous version of it. There are versions that are compatible with iPhoto 5, 6, 7 and 8 libraries and Tiger and Leopard. Just put the application in the Dock and click on it whenever you want to backup the dB file. iPhoto does not have to be closed to run the application, just idle. You can download it at Toad's Cellar. Be sure to read the Read Me pdf file.
NOTE: The new rebuild option in iPhoto 09 (v. 8.0.2), "Rebuild the iPhoto Library Database from automatic backup", makes this tip obsolete. -
Hey all,
My temp tablespace has grown too big (19GB after I ran a particularly bad Cartesian join). I can't figure out how to get it cleaned up and I'd like my disk back. I've modified the max through OEM, but now it just shows that I'm using more than the max. Tried shutting down and restarting the service and the machine with no luck. How can I get SMON to do its job?
Win2K Pro
Oracle Database 10g Release 10.1.0.2.0
As always, thanks,
Pete
There's no automatic shrinking of the temporary tablespace; Oracle says this is a feature, not a bug. The workaround is to create a new temporary tablespace with a smaller size:
SQL> create temporary tablespace TEMP1 tempfile 'c:\temp01.dbf' size 100M extent management
local uniform size 128K;
-- Set new tablespace as default temporary tablespace for all users in database.
SQL> alter user <username> temporary tablespace TEMP1;
If you have a large number of users, the following SQL can be used to generate the code to change them all.
SQL> select 'alter user '||username||' temporary tablespace TEMP1;'
from dba_users;
-- Drop the old tablespace.
SQL> drop tablespace temp including contents and datafiles;
SQL server error log size is too big to handle
I am working with a large database on Windows SQL Server 2008 R2; it has to run continuously 24x7, so it is not possible to restart the server from time to time. It is a kind of monitoring system for big machines. Because of this, the SQL Server error logs grow too big, sometimes up to 60-70 GB on a limited-size hard drive, and I can't keep deleting them manually. Can someone please suggest a way to stop the creation of such error logs, or to recycle them after some time? Most of the entries are of this kind --
Setting database option RECOVERY to simple for database db_name
P.S. - I have read about limiting the error logs to 6, etc., but that didn't help. It would be best if you could suggest some method to disable these logs.
Hi Mohit11,
According to your description, your SQL Server error logs are growing too big to handle at a limited sized hard drive, and you want to know how to stop the generation of such error logs or recycle them after sometime automatically without restarting the
SQL Server, right?
As others mentioned above, we may not be able to disable SQL Server error log generation. However, we can recycle the error logs automatically by running sp_cycle_errorlog on a fixed schedule (e.g. every two weeks) using a SQL Agent job, so that the error logs are recycled automatically without restarting SQL Server.
It is also very important to keep the error log files readable. So we can increase the number of error logs a little and run sp_cycle_errorlog more frequently (e.g. daily); then each file will be smaller and more readable, and we can recycle the log files automatically.
In addition, to avoid the total size of the log files unexpectedly growing too large (it can sometimes happen), we can run the following query in a SQL Agent job to automatically delete all the old log files when their combined size is larger than some value we want to keep (e.g. 30GB):
--create a temp table to gather information about the error log files
CREATE TABLE #ErrorLog
(
Archive INT,
Dt DATETIME,
FileSize INT
)
GO
INSERT INTO #ErrorLog
EXEC xp_enumerrorlogs
GO
--delete all the old log files if the size of all the log files is larger than 30GB
DECLARE @i int = 1;
DECLARE @Log_number int;
DECLARE @Log_Max_Size int = 30*1024; --here is the max size (M) of all the error log files we want to keep, change the value according to your requirement
DECLARE @SQLSTR VARCHAR(1000);
SET @Log_number = (SELECT COUNT(*) FROM #ErrorLog);
IF (SELECT SUM(FileSize)/1024/1024 FROM #ErrorLog) >= @Log_Max_Size
BEGIN
WHILE @i <= @Log_number
BEGIN
SET @SQLSTR = 'DEL "C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Log\ERRORLOG.' + CONVERT(VARCHAR,@i) + '"';
EXEC xp_cmdshell @SQLSTR;
SET @i =@i + 1;
END
END
DROP TABLE #ErrorLog
For more information about How to manage the SQL Server error log, please refer to the following article:
http://support.microsoft.com/kb/2199578
If you have any question, please feel free to let me know.
Regards,
Jerry Li -
DBIF_RSQL_INVALID_RSQL statement too big
Hi All,
We are on SAP R/3 4.6C (kernel 46D 64-bit, patch 2113 with the latest DBSL library, AIX 5.2, DB2 7.1 on z/OS 1.4) and for a BW extraction we have the following dump:
"ABAP runtime error DBIF_RSQL_INVALID_RSQL, RSQL error 13 occurred.
The following message also appears in the developer trace:
B *** ERROR => dbtran ERROR (set_input_da_spec): statement too big
B marker count = 1195 > max. marker count = 726."
According to OSS note 655018 the limit of the marker count is 2000, but in the work process trace we read 726.
Many thanks for your help
Bob
Hi Bernhard,
thank you for your reply. This is the dump:
============================================================
ABAP runtime errors DBIF_RSQL_INVALID_RSQL
Occurred on 16.01.2008 at 10:29:04
RSQL error 13 when accessing table "EKKO ".
What happened?
The current ABAP/4 program "SAPDBERM " had to be terminated because
one of the statements could not be executed.
This is probably due to an error in the ABAP/4 program.
What can you do?
Note the actions and input that caused the error.
Inform your SAP system administrator.
You can print out this message by choosing "Print". Transaction ST22
allows you to display and manage termination messages, including keeping
them beyond their normal deletion date.
Error analysis
The SQL statement generated from SAP Open SQL violates a
restriction imposed by the database system used in R/3.
For details, refer to either the system log or the developer trace.
Possible reasons for error:
o Maximum size of an SQL statement exceeded.
o The statement contains too many input variables.
o The input data requires more space than is available.
o ...
You can usually find details in the system log (SM21) and in
the developer trace of the relevant work process (ST11).
In the event of an error, the developer trace often gives the
current restrictions.
How to correct the error
The SAP Open SQL statement concerned must be divided into several
smaller units.
If the problem occurred because an excessively large table was used
in an IN itab construct, you can use FOR ALL ENTRIES instead.
When you use this addition, the statement is split into smaller units
according to the restrictions of the database system used.
System environment
SAP Release.............. "46C"
Application server....... "R3PRD"
Network address.......... "172.24.10.50"
Operating system......... "AIX"
Release.................. "5.2"
Hardware type............ "00C7A3EC4C00"
Database server.......... "r3prddb_gb"
Database type............ "DB2"
Database name............ "PRD"
Database owner........... "SAPR3"
Character set............ "en_US.ISO8859-1"
SAP kernel............... "46D"
Created on............... "Aug 25 2005 20:51:50"
Created in............... "AIX 1 5 00447C4A4C00"
Database version......... " "
Patch level.............. "2113"
Patch text............... " "
Supported environment....
Database................. "DB2 for OS/390 6.1"
SAP database version..... "46D"
Operating system......... "AIX 1 4, AIX 2 4, AIX 3 4, AIX 1 5, AIX 2 5, AIX 3
5, , System build information:, -
, LCHN :
775484"
User, transaction...
Client.............. 300
User................ "ALEREMOTE"
Language key........ "E"
Transaction......... " "
Program............. "SAPDBERM "
Screen.............. "SAPMSSY0 1000"
Screen line......... 6
Information on where termination occurred
The termination occurred in the ABAP/4 program "SAPDBERM " in
"PUT_EKKO".
The main program was "AQZZSYSTBWGENER0SY000000000361 ".
The termination occurred in line 333
of the source code of program "SAPDBERM " (when calling the editor 3330).
The program "SAPDBERM " was started as a background job.
Source code extract
003030 ENDFORM.
003040
003050 ----
003060 FORM PUT_EKKO.
003070
003080 CONSTANTS: LC_PACKAGE_SIZE LIKE SY-TFILL VALUE 10.
003090 DATA: BEGIN OF LT_SEKKO OCCURS 0,
003100 EBELN LIKE EKKO-EBELN,
003110 END OF LT_SEKKO.
003120 DATA: BEGIN OF LT_TEKKO OCCURS 0,
003130 EBELN LIKE EKKO-EBELN,
003140 END OF LT_TEKKO.
003150 DATA: L_INDEX_FROM LIKE SY-TABIX,
003160 L_INDEX_TO LIKE SY-TABIX,
003170 L_EKPO_TABIX LIKE SY-TABIX.
003180 PERFORM KSCHL_SETZEN.
003190 ONEST = P_ONEST.
003200 TEST = TESTLAUF.
003210 DB_DEL = ' '. "VETVG und Komponenten bleiben noch stehen
003220 DETPROT = P_PROT.
003230 ANDAT = ER_ANDAT.
003240 PERFORM AKTIVITAET_SETZEN(SAPFM06D) USING '06'.
003250 SELECT EBELN FROM EKKO INTO TABLE LT_SEKKO
003260 WHERE EBELN IN ER_EBELN
003270 AND EKORG IN ER_EKORG
003280 AND BSTYP IN ER_BSTYP
003290 AND BEDAT IN ER_BEDAT
003300 AND BSART IN ER_BSART
003310 AND EKGRP IN ER_EKGRP
003320 * AND MEMORY NE 'X'
> ORDER BY EBELN.
003340 L_INDEX_FROM = 1.
003350
003360 DO.
003370 REFRESH LT_TEKKO.
003380 L_INDEX_TO = L_INDEX_FROM + LC_PACKAGE_SIZE - 1.
003390 LOOP AT LT_SEKKO FROM L_INDEX_FROM
003400 TO L_INDEX_TO.
003410 APPEND LT_SEKKO TO LT_TEKKO.
003420 ENDLOOP.
003430 IF SY-SUBRC NE 0.
003440 EXIT.
003450 ENDIF.
003460 SELECT * FROM EKKO INTO TABLE XEKKO
003470 FOR ALL ENTRIES IN LT_TEKKO
003480 WHERE EBELN = LT_TEKKO-EBELN
003490 ORDER BY PRIMARY KEY.
003500 SELECT * FROM EKPO INTO TABLE XEKPO
003510 FOR ALL ENTRIES IN LT_TEKKO
003520 WHERE EBELN = LT_TEKKO-EBELN
Contents of system fields
SY field contents..................... SY field contents.....................
SY-SUBRC 0 SY-INDEX 0
SY-TABIX 1 SY-DBCNT 152
SY-FDPOS 1 SY-LSIND 0
SY-PAGNO 0 SY-LINNO 1
SY-COLNO 1
============================================================
We have also installed the latest kernel and DBSL 2342.
Thank you and regards
Bob -
CatalogData folder too big on passive server in DAG
I have a two Exchange 2010 server DAG. One is active and the other is mounted but passive.
On at least two of the replicated and mounted databases on the passive server, the CatalogData folder is too big.
I noticed inside of them, they have old .ci and .dir files from last year.....really long ago and I need to get rid of them.
Can I delete them safely without affecting the DAG and/or replication?
What is the best way to do this on the passive server without affecting any mail flow or the DAG?
Simply stop the Exchange indexing service on the server, delete the catalog folder, and restart the indexing service. Or run Update-MailboxDatabaseCopy (ServerCopy) -CatalogOnly.
Please Note: My posts are provided "AS IS" without warranty of any kind, either expressed or implied. -
Aperture 3 library too big, crashes.
how big is too big for an aperture library to run smoothly?
is 200gb too big for my everyday library of images, or should that not be a problem on my 1 year old i5 quad imac with 16gb of ram?
thanks for the help.
I recommend using a Referenced-Masters workflow.
It works great for 200k+ images, the Library with its Previews lives on the internal drive and is always accessible. Masters live on external drives. The Library is backed up via Vaults and originals are backed up to redundant locations using the Finder before import into Aperture.
Aperture is designed to bite into small chunks at a time, so TBs of data do not bother the app itself. However, handling super-large batches of data like hundreds of GB batches on consumer hardware tends to be problematic, it is that simple.
Slower speeds seem to exacerbate handling large data chunks.
IMO referenced Masters make far more sense than building huge managed-Masters Libraries. With referenced Masters one has no need to copy a 1.5 TB sized file. I find that even on a 2011 MBP, copying 5-15 GB batches of RAW/JPEG files fails with some frequency, enough so that I always verify the copy.
• Hard disk speed. Drives slow as they fill so making a drive more full (which managed Masters always does) will slow down drive operation.
• Database size. Larger databases are by definition more prone to "issues" than smaller databases are (the Aperture Library is the db in this case).
• Vaults. Larger Library means larger Vaults, and Vaults are an incremental repetitive backup process, so again larger Vaults are by definition more prone to "issues" than smaller Vaults are. One-time backup of Referenced Masters (each file small, unlike a huge managed-Masters DB) is neither incremental nor ongoing; which is by definition a more stable process.
Managed-Masters Libraries can work, but they cannot avoid the basic database physics.
Note that whether managed or referenced, original images should be separately backed up prior to import into Aperture or any other images management application. IMO after backing up each batch of original images importing that batch into Aperture as a new Project by reference makes by far the most sense. Building a huge managed Library or splitting into multiple smaller Libraries is less logical.
HTH
-Allen -
Tiny video TOO BIG for YouTube?
I'm sorry--I searched for "compress video" and got no results--so--
I have a very short (1.09 minute) video I want to upload to YouTube. It is 122 MB. YouTube has a 100 MB maximum capacity. I've seen 10 minute videos on YouTube, yet my tiny 1 minute video is too big. How do I make it smaller so it will upload? I'm so sorry if this has been discussed before, but I get no help for this in my Mac Help section - thank you!
If you have QuickTime Pro you can export it and reduce the size. Or use VisualHub in its demo mode (2 minutes max) to convert it to a smaller format and size. It works quite well.
Do you Twango?
-
Temp tablespace grow too big???
We have EBS R12.1 on Linux x86_64 with Oracle database version 11.1.0.7. Every week the temporary tablespace grows too big (> 32 GB) and runs out of extents.
Based on our research, some SQL statement or report causes this issue. If we run "analyze statistics", most of the time the problem is fixed. This sometimes means we need to run "analyze statistics" several times in one day.
Does anyone have a solution for this?
Thanks.
Please see if these docs help:
Temporary Segments: What Happens When a Sort Occurs [ID 102339.1]
Queries to monitor Temporary Tablespace usage [ID 289894.1]
How Can Temporary Segment Usage Be Monitored Over Time? [ID 364417.1]
Thanks,
Hussein -
How big is too big for .properties file for a ResourceBundle?
The subject says it all. We are using ResourceBundles to manage the dictionary of English/French literals for our web enabling project. Our .properties file is currently 675 key/value pairs. How big can .properties files grow before they are too big?
Thanks,
Don Booker - Programming Team Leader
Common Departmental Financial System
Public Works & Government Services Canada
The resource gets loaded into memory, so there's a memory cost that increases with the number of properties. (There's also a time cost when it loads, but that's only once.) And finding one of the properties requires a search, so there's a processing cost involved. But it's based on a Hashtable, so it isn't a major cost even if the number of properties gets very large.
So those are your constraints. The question is, though: what is your alternative when the resource does get too big? A database? That reduces the memory cost but increases the processing cost. I would guess you could let your properties file grow a lot before you were forced to go to a database, but actual testing would help. -
I can't figure out how to downsize my photos for a website that won't take anything above 3MB; my photos are 4MB. Even when I send them in an email, they are too big to send more than 3 at a time, and I don't know how to fix it! Help says that when I select a photo it gives me the option to downsize. It does NOT?????
This is the iPhoto for iOS forum. Are you asking about iPhoto on your Mac?
In iPhoto on your Mac select Export and you will have the option to select a size for your photo.
In iPhoto for iOS this capability is not there. I use PhotoForge2 on my iPad to resize my photos to meet the requirements you describe. -
Saving animated gif error - canvas size too big?
Hi,
i have been asked to create an animated gif which has a snow effect in the background. The animation runs pretty well using the pattern style to move the animated snow down. The problem is when I try to export the animation the save for web dialogue window freezes and I can't quit photoshop. I have tried to make the canvas (image size) smaller and it works.
Is there a memory problem with creating animations in Photoshop if the document size is too big? I'm running a 3000x1000 px canvas at 72dpi.
The animation will be used in a web app, which a script could probably do a better job of, but I need to try this first.
thanks
lister
Something is very wrong here. The point of using GIFs is that they are small and OK to put on a Web site.
First thing: forget all about PPI - this has absolutely no relevance to Web/screen images; it is only used for printing. You need to think about the pixel size.
So, how big in pixels are your starting images?
How big do you want the resulting GIF to be? Again, you have to decide in pixels, as the viewing size will depend upon screen resolution of the person looking at it. You can use a notional PPI of 70-100 to guess how big it will be in inches, but that's all it is - a guess.
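The rough size estimate can be sketched as follows (the 3000-pixel width is the poster's canvas; the 100 PPI figure is only the notional value suggested above):

```shell
# Guess on-screen size in inches: pixels divided by a notional screen PPI.
PIXELS=3000   # canvas width in pixels
PPI=100       # notional screen PPI -- a rough assumption, not a real measurement
echo "$((PIXELS / PPI)) inches"
```

That is roughly a 30-inch-wide image, which is why the export freezes are unsurprising; for a web app, a few hundred pixels across is usually plenty.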
Is this getting you going in the right direction?
I cannot resist commenting that I hate almost all animated GIFs as they reduce Web site usability so much.