Number range considerations during data load
Hello team,
We are upgrading from 4.6 to ECC 6.0, and our ABAP developer is about to run a test data load. I have already configured the number ranges in the new system to match what is in 4.6. Should the number ranges be deleted before the test data load? How will the system react if the number ranges are already there?
Thanks
If your number range series have not changed, there will be no issues.
You maintain the series of number ranges in PA04 and set that series number as the default value in the NUMKR feature.
If you miss any of the above customizing steps, you may get a number mismatch: say you have a series from 100 to 200, but in the number ranges you entered 200 to 300; that can lead to trouble.
I have never come across this situation myself, so treat this just as information.
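Before the test load, it may be worth verifying that the intervals configured in the new system really match 4.6. A minimal sketch (the default object name and interval number below are placeholder assumptions; NRIV is the standard number range interval table):

```abap
* Sketch: read the configured interval for a number range object so it
* can be compared against the 4.6 settings before the test load.
* The defaults for P_OBJ / P_NR are placeholders - use your own values.
REPORT z_check_nr_interval.

PARAMETERS: p_obj LIKE nriv-object    DEFAULT 'RP_PERNR', "NR object
            p_nr  LIKE nriv-nrrangenr DEFAULT '01'.       "interval no.

DATA ls_nriv TYPE nriv.

SELECT SINGLE * FROM nriv INTO ls_nriv
  WHERE object    = p_obj
    AND nrrangenr = p_nr.

IF sy-subrc <> 0.
  WRITE: / 'Interval', p_nr, 'for object', p_obj, 'does not exist.'.
ELSE.
  WRITE: / 'From number:   ', ls_nriv-fromnumber,
         / 'To number:     ', ls_nriv-tonumber,
         / 'Current level: ', ls_nriv-nrlevel. "last number assigned
ENDIF.
```

If the intervals match, there is no need to delete them; existing intervals with a non-zero current level are the thing to watch, since legacy load programs may collide with numbers already assigned.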
Similar Messages
-
Number Range Buffering while Loading
Hi all,
does anyone know whether it is possible to change the buffer size of number range objects in BW while these number range objects are being accessed by loads?
Thanks
Gerrit
Hi there,
I wrote a program, adapted from other code posted here on SDN, and use it in a process chain to change the buffering of the number range object before the load; after the load, I use the same program to revert it.
You pass the number range object as a parameter, along with the number of numbers to buffer and an option to add main memory buffering or not use it.
Here it is the code:
*&---------------------------------------------------------------------*
*& Report Z_BUFFER_NUMBER_RANGE
*& Author: Rajiv Deshpande                       Date: August 01 2006
*& Description: This program changes the value for a selected number
*&   range object.
*& Input:  number range object, no. of numbers in memory, and radio
*&   button selection Add / Delete (main memory buffering)
*& Output: displays the list of dimensions in the given cube with
*&   additional information such as the number range object and current
*&   status. The number range object has a hotspot which, when clicked,
*&   calls a BDC program to do the user-defined activity (add/delete
*&   main memory buffering).
*&---------------------------------------------------------------------*
report z_buffer_number_range line-size 225.

tables: nriv, tnro.
type-pools: rzd1.

constants: nodata type c value '/'.                         "nodata

parameters: bid      like nriv-object obligatory,
*           p_cube   like rsdcube-infocube obligatory,
            p_number like tnro-noivbuffer obligatory,
            p_add    radiobutton group grp1,
            p_del    radiobutton group grp1.

data: opt         type ctu_params,
      c_cube      type rsd_infocube,
      c_number(8) type c,
      bdc_obj     like nriv-object,
      l_escube    type rsd_s_cube,
      l_etdime    type rsd_t_dime,
      bdcdata     like bdcdata occurs 0 with header line.

start-of-selection.
* c_cube = p_cube.
  c_number = p_number.
  bdc_obj  = bid.
* perform get_the_cube_info.
* perform write_info.

end-of-selection.
*at line-selection.
* bdc_obj = sy-lisel+81(10).
  perform fill_bdc using bdc_obj c_number p_add p_del.
  perform call_tcode.
**& Form get_the_cube_info
** Gets CUBE Information
*form get_the_cube_info .
* call function 'RSD_CUBE_GET'
* exporting
* i_infocube = c_cube
* i_objvers = 'A'
* i_bypass_buffer = 'X'
** I_WITH_ATR_NAV = RS_C_FALSE
* importing
* e_s_cube = l_escube
** E_S_TBHD =
** E_TBHD_NOT_FOUND =
** E_T_CUBET =
** E_T_CUBE_IOBJ =
** E_T_CUBE_DIME =
* e_t_dime = l_etdime
** E_T_DIMET =
** E_T_DIME_IOBJ =
** E_T_ICHA_PRO =
** E_T_IKYF_PRO =
** E_T_IC_VAL_IOBJ =
** E_T_CUBE_PART =
** E_T_CUBE_PART_IOBJ =
** E_T_MULTI_IOBJ =
* exceptions
* infocube_not_found = 1
* illegal_input = 2
* others = 3.
* if sy-subrc <> 0.
* case sy-subrc.
* when '1'.
* write: 'Info cube :', c_cube, 'not found.'.
* endcase.
* endif.
*endform. " get_the_cube_info
*& Form write_info
* Writes the Cube Information, also has HotSpot defined on the Number
* Range #.
*form write_info .
* data: begin of i_time occurs 0.
* include structure rsd_s_dime.
* data: end of i_time.
* data: value type i,
* value1 type i.
* if p_add = 'X'.
* format color 5.
* write:/, 'ADD MAIN MEMORY MODE'.
* format color off.
* else.
* format color 6.
* write:/, 'DELETE MAIN MEMORY MODE'.
* format color off.
* endif.
* write:/,'Cube Name:', c_cube.
* write:/, 00 'DIM',
* 12 'DIM TEXT',
* 38 '#',
* 43 'TYPE',
* 52 'DIM TABLE NAME',
* 82 'Number Range #',
* 100 'Current Status',
* 115 'Buffer(Y/N)',
* 130 '# in Memory'.
* uline.
* loop at l_etdime into i_time.
* case i_time-iobjtp.
* when 'DPA' or 'TIM' or 'UNI'.
* continue.
* endcase.
* select single * from nriv client specified
* where client = sy-mandt
* and object = i_time-nobject.
* if sy-subrc = 0.
* select single * from tnro where object = i_time-nobject.
* endif.
* value = nriv-nrlevel.
* value1 = tnro-noivbuffer.
* write: /
** i_time-infocube,r
** i_time-objvers,
* i_time-dimension(10) under 'DIM',
** i_time-langu,
** i_time-txtsh,
* i_time-txtlg(25) under 'DIM TEXT',
* i_time-posit under '#',
* i_time-iobjtp under 'TYPE',
** i_time-linitfl,
** i_time-highcard,
* i_time-tablnm under 'DIM TABLE NAME',
** i_time-tstpnm,
** i_time-timestmp,
** i_time-numbranr,
* i_time-nobject under 'Number Range #' hotspot on
* quickinfo 'Change' color 3,
* value under 'Current Status' decimals 0,
* tnro-buffer under 'Buffer(Y/N)',
* value1 under '# in Memory'.
** i_time-ddstate.
* endloop.
* uline.
*endform. " write_info
*& Form fill_bdc
* Fills the BDC, depending on the flag (Add/Delete)
* --> Object #, Number, Add Flag, Delete Flag
form fill_bdc using object number add del.
  perform bdc_dynpro using 'SAPMSNRO' '0150'.
  perform bdc_field  using 'BDC_CURSOR' 'NRIV-OBJECT'.
  perform bdc_field  using 'BDC_OKCODE' '=UPD'.
  perform bdc_field  using 'NRIV-OBJECT' object.
  if add = 'X'.
    perform bdc_dynpro using 'SAPLSNR2' '0100'.
    perform bdc_field  using 'BDC_OKCODE' '=PUF1'.
    perform bdc_dynpro using 'SAPLSNR2' '0100'.
    perform bdc_field  using 'BDC_CURSOR' 'TNRO-NOIVBUFFER'.
    perform bdc_field  using 'BDC_OKCODE' '=SAVE'.
    perform bdc_field  using 'TNRO-NOIVBUFFER' number.
  else.
    perform bdc_dynpro using 'SAPLSNR2' '0100'.
    perform bdc_field  using 'BDC_OKCODE' '=NONE'.
    perform bdc_dynpro using 'SAPLSNR2' '0100'.
    perform bdc_field  using 'BDC_OKCODE' '=SAVE'.
  endif.
  perform bdc_dynpro using 'SAPLSPO1' '0300'.
  perform bdc_field  using 'BDC_OKCODE' '=YES'.
endform.                    " fill_bdc
*& Form call_tcode
* Calls transaction SNRO with display mode 'N' (no display) and
* default size.
form call_tcode.
  data: l_msg(25).
  concatenate 'Changing' bdc_obj into l_msg separated by space.
  opt-dismode = 'N'.
  opt-defsize = 'X'.
  call function 'SAPGUI_PROGRESS_INDICATOR'
    exporting
      percentage = 75
      text       = l_msg.
  call transaction 'SNRO' using bdcdata options from opt.
  if sy-subrc <> 0.
    write:/ 'System is blocked; no changes can be made.'.
    exit.
  endif.
  refresh bdcdata.
  clear bdcdata.
endform.                    " call_tcode
*       Start new screen                                              *
form bdc_dynpro using program dynpro.
  clear bdcdata.
  bdcdata-program  = program.
  bdcdata-dynpro   = dynpro.
  bdcdata-dynbegin = 'X'.
  append bdcdata.
endform.                    "bdc_dynpro
*       Insert field                                                  *
form bdc_field using fnam fval.
  if fval <> nodata.
    clear bdcdata.
    bdcdata-fnam = fnam.
    bdcdata-fval = fval.
    append bdcdata.
  endif.
endform.                    "bdc_field
Diogo. -
Hi Gurus,
I am getting an error during data loading. On the BI side the error says the background job was cancelled, and when I check on the R/3 side I see the following error:
Job started
Step 001 started (program SBIE0001, variant &0000000065503, user ID R3REMOTE)
Asynchronous transmission of info IDoc 2 in task 0001 (0 parallel tasks)
DATASOURCE = 2LIS_11_V_ITM
Current Values for Selected Profile Parameters *
abap/heap_area_nondia......... 2000683008 *
abap/heap_area_total.......... 4000317440 *
abap/heaplimit................ 40894464 *
zcsa/installed_languages...... ED *
zcsa/system_language.......... E *
ztta/max_memreq_MB............ 2047 *
ztta/roll_area................ 6500352 *
ztta/roll_extension........... 3001024512 *
4 LUWs confirmed and 4 LUWs to be deleted with function module RSC2_QOUT_CONFIRM_DATA
ABAP/4 processor: DBIF_RSQL_SQL_ERROR
Job cancelled
Please help me out; what should I do?
Regards,
Mayank
Hi Mayank,
The log says it went to a short dump due to a temp space issue. As it is a source system job, check the temp tablespace on the source system side, and check on the BI side as well.
Check with your Basis team regarding the temp PSA tablespace: if it is out of space, ask them to increase the tablespace and then repeat the load.
Check the below note
Note 796422 - DBIF_RSQL_SQL_ERROR during deletion of table BWFI_AEDAT
Regards
KP
Edited by: prashanthk on Jul 19, 2010 10:42 AM -
Error in Unit conversion while data loading
Hi,
I have maintained a DSO (ZDSO) and entered it in the Material (0MATERIAL) InfoObject > BEx tab > Base unit of measure. Then I loaded this ZDSO from the standard DataSource 0MAT_UNIT_ATTR, so that all the conversion factors between the different units maintained in the material master data in ECC are loaded into this DSO.
Then I created a conversion type (ZCON) that reads the source unit from the record and converts it to the fixed unit "ST", with 0MATERIAL as the reference InfoObject. ST is a custom UoM here.
I use the ZCON conversion type under the conversion tab of the key figure in BEx reports to convert quantity in base UoM to quantity in ST. As this is standard functionality, the conversion automatically takes place using the ZDSO mentioned above, because the source and target UoM are not from the same dimension (base UoM is EA, target UoM is ST).
If no conversion factor to ST is found in the ZDSO, conversion to the base UoM should happen automatically. This works perfectly in BEx, but the same conversion type ZCON gives an error during data loads: it fails for those materials for which no ST conversion is maintained. When it is not maintained, it should by default convert to the base UoM, but instead the load fails.
Hope I am able to explain the issue.
Please help me on on this issue or any way around.
Thanks in advance.
Prafulla
Ganesh,
Can you please check the alpha conversion routine and also the NODEID for that InfoObject?
There might be some inconsistency in the code.
Hope it helps
Gattu -
Hello Friends,
I have a question regarding performance in BW 3.5.
For data loads from R/3 to BW we have 4 options:
Only PSA
Only Data Target
PSA and later data target
PSA and Data target in parallel.
From a system performance point of view, which is the best option (using the least system resources), and why?
Your help is appreciated.
Thanks
Tony
Hi,
From a performance point of view, "PSA and later data target" is the better option.
For more info, check this link:
http://help.sap.com/saphelp_nw04/Helpdata/EN/80/1a6567e07211d2acb80000e829fbfe/frameset.htm
Regards,
shikha -
Error while data loading in real time cube
Hi experts,
I have a problem. I am loading data from a flat file. The data loads correctly up to the DSO, but when I try to load it into the cube I get an error.
The cube is a real-time cube for planning. I have changed its status to allow data loading, but the DTP still gives an error.
It shows the error "error while extracting from DataStore", plus an RSBK 224 error and an RSAR 051 error. What was the resolution to this issue? We are having the same issue, only with an external system (not a flat file). We get the RSAR 051 error with return code 238, as if it is not even getting to the RFC connection (DI_SOURCE). We have been facing this issue for a while and have even opened a message with SAP.
-
Short Dump Error While Data load
Hi Experts,
The data load to an ODS on the production server has failed because of a short dump. The error message shows "OBJECTS_OBJREF_NOT_ASSIGNED".
Please help me out with a solution.
Regards,
Vijay
Hi Vijay,
Follow the steps below (may help you):
Go to the monitor screen > Status tab > using the wizard or the menu path Environment -> Short dump -> In the warehouse.
Select the error and double-click.
Analyse the error from the message.
1. Go to Monitor
   --> Transactional RFC
   --> In the Warehouse
   --> Execute
   --> Edit
   --> Execute LUW
Refresh the transactional RFC screen, then go to the data target(s) that failed and check the status of the request: it should now be green.
2. In some cases the above process will not work (the bad requests still exist after the above procedure).
In that case, go to the data targets, delete the bad requests, and run the update again.
Regards,
BH -
Message while data load is in progress
Hi,
We load data into the InfoCube twice every day, in the morning and in the afternoon.
The loading methodology is always delete-and-load, so at any given point we have only one request in the cube. The data load takes around 20-30 minutes.
When a user runs the query during the data load, he gets the message 'No Applicable Data Found'. Can anyone please advise how we can show a proper message instead, such as 'Data is updating in the system. Please try after some time.'?
We are using the BEx Browser with a template and a query attached to the template.
Please advise.
Regards
Ramesh Ganji
Hi,
Tell the users the timing of the data loads so that they are aware the loads are in progress and the data will not be available for reporting at that time, and ask them not to run reports then. Give a buffer of around 15-20 minutes, as there might be an issue somewhere down the line, and ask them to run the reports outside the times the data loads are happening.
You could also reschedule the process chain to finish before the users come in.
As for the functionality you are referring to, I am not sure it can be achieved.
Regards
Puneet -
Disable a portlet while data loads
Hi,
Is there a way to disable a portlet programmatically? We load data at certain times of the day and would like to temporarily disable the portlets. Is there a call in the SOAP API that I could use to do this?
Thanks
Rick
Thanks for responding.
We are looking for a way to have the process that makes the changes disable the appropriate gadgets using the SOAP API. In 4.5 we did it with a stored proc. This lets the folks modifying the data decide when they need to shut off access. I agree that this is an option, but it would mean modifying lots of gadget code versus one data load service.
Thanks
Rick -
How to skip an entire data packet while data loading
Hi All,
We want to skip some records based on a condition while loading from the PSA to the cube, for which we have written ABAP code in the start routine.
This is working fine.
But there is one data packet where all the records are supposed to be skipped, and there it gives a dump with the exception CX_RSROUT_SKIP_RECORD.
The ABAP code written is:
DELETE SOURCE_PACKAGE WHERE FIELD = 'ABC'.
For this particular data packet, all the records satisfy the condition and get deleted.
Please advise how to skip the entire data packet if all the records satisfy the deletion condition, and how to handle the exception CX_RSROUT_SKIP_RECORD.
Edited by: Rahir on Mar 26, 2009 3:26 PM
Edited by: Rahir on Mar 26, 2009 3:40 PM
Hi All,
The dump I am getting is:
The exception 'CX_RSROUT_SKIP_RECORD' was raised, but it was not caught anywhere along the call hierarchy.
Since exceptions represent error situations and this error was not adequately responded to, the running ABAP program 'GPD4PXLIP83MFQ273A2M8HU4ULN' has to be terminated.
But this comes only when all the records in a particular data packet get skipped.
For the rest of the data packets it works fine.
I think if the data packet (with 0 records) itself can be skipped, this will be resolved, or the exception will be taken care of.
Please advise how to resolve this and avoid 'CX_RSROUT_SKIP_RECORD' at the earliest.
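For what it is worth, a start routine that simply deletes records should not need CX_RSROUT_SKIP_RECORD at all: an empty SOURCE_PACKAGE is normally acceptable, and that exception is meant to be raised per record in field or end routines, where the DTP framework catches it. A sketch of both patterns (the field name and method names are placeholders, not taken from the original post):

```abap
* Pattern 1: start routine - delete unwanted rows; no exception is
* raised, and a fully emptied SOURCE_PACKAGE is simply passed on.
METHOD start_routine.
  DELETE source_package WHERE field = 'ABC'.
ENDMETHOD.

* Pattern 2: field/end routine - skip one record at a time; the DTP
* framework catches CX_RSROUT_SKIP_RECORD here, so it must only be
* raised inside these routines, never in contexts that do not catch it.
METHOD compute_some_field.
  IF source_fields-field = 'ABC'.
    RAISE EXCEPTION TYPE cx_rsrout_skip_record.
  ENDIF.
ENDMETHOD.
```

If the dump appears only on the all-deleted packet, it suggests the skip exception is being raised somewhere outside the per-record routines for that packet, which is worth checking first.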
Edited by: Rahir on Mar 27, 2009 6:25 AM
Edited by: Rahir on Mar 27, 2009 7:34 AM -
Error while data loading in BI
Hi gurus,
Our BI team is unable to load data from BI staging to the BI targets. There is no data in the BI targets, so users cannot test the BI reports; when they run them, the header file status shows yellow instead of green.
Please help.
Regards,
Priyanshu Srivastava
The problem is that job logs cannot be written, for example for job BIDTPR_6018_1.
SM51:
M *** ERROR => ThCallHooks: event handler rsts_before_commit for event
SM21: F6 1 TemSe input/output to unopened file.
Can anybody tell me how to resolve that?
Regards,
Priyanshu Srivastava -
Update rule problem - while data load
Hi friends,
I got the following error while doing initialisation for 2lis_02_sgr.
"ABORT was set in the customer routine 9998
Error 1 in the update "
In the forum i searched for this error and this error is something related to the start routine in my update rule.
But i dont know whats wrong with my routine.
Im giving the start routine below,pls go through this and give me your suggestions..
PROGRAM UPDATE_ROUTINE.
*$*$ begin of global - insert your declaration only below this line  *-*
* TABLES: ...
TABLES: /bic/AZMM_PUR100.
DATA: t_pur1 LIKE /bic/AZMM_PUR100 OCCURS 0 WITH HEADER LINE.
*$*$ end of global - insert your declaration only before this line   *-*
* The following definition is new in BW 3.x
TYPES:
  BEGIN OF DATA_PACKAGE_STRUCTURE.
    INCLUDE STRUCTURE /BIC/CS2LIS_02_SGR.
TYPES:
    RECNO LIKE sy-tabix,
  END OF DATA_PACKAGE_STRUCTURE.
DATA:
  DATA_PACKAGE TYPE STANDARD TABLE OF DATA_PACKAGE_STRUCTURE
    WITH HEADER LINE
    WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
FORM startup
  TABLES   MONITOR       STRUCTURE RSMONITOR  "user defined monitoring
           MONITOR_RECNO STRUCTURE RSMONITORS "monitoring with record no.
           DATA_PACKAGE  STRUCTURE DATA_PACKAGE
  USING    RECORD_ALL    LIKE SY-TABIX
           SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
  CHANGING ABORT         LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
*$*$ begin of routine - insert your code only below this line        *-*
* fill the internal tables "MONITOR" and/or "MONITOR_RECNO",
* to make monitor entries
* if abort is not equal zero, the update process will be canceled
  CLEAR: t_pur1[],
         t_pur1,
         abort.
  SELECT * INTO TABLE t_pur1 FROM /bic/AZMM_PUR100.
  IF sy-subrc EQ 0.
    SORT t_pur1 BY doc_date
                   doc_item
                   doc_num.
  ELSE.
    monitor-msgid = sy-msgid.
    monitor-msgty = sy-msgty.
    monitor-msgno = sy-msgno.
    monitor-msgv1 = sy-msgv1.
    monitor-msgv2 = sy-msgv2.
    monitor-msgv3 = sy-msgv3.
    monitor-msgv4 = sy-msgv4.
    APPEND monitor.
*   if abort is not equal zero, the update process will be canceled
    ABORT = 1.
  ENDIF.
  ABORT = 0.
*$*$ end of routine - insert your code only before this line         *-*
ENDFORM.
Thanks & Regards
Ragu
Thanks gimmo and a.h.p,
I have made the corrections as you said; please verify.
Also, kindly explain the reason for this start routine: what exactly does it do?
  CLEAR: t_pur1[],
         t_pur1,
         abort.
  SELECT * INTO TABLE t_pur1 FROM /bic/AZMM_PUR100.
  IF sy-subrc EQ 0.
    SORT t_pur1 BY doc_date
                   doc_item
                   doc_num.
    ABORT = 0.  " added abort = 0 as per your suggestion
  ELSE.
    monitor-msgid = sy-msgid.
    monitor-msgty = sy-msgty.
    monitor-msgno = sy-msgno.
    monitor-msgv1 = sy-msgv1.
    monitor-msgv2 = sy-msgv2.
    monitor-msgv3 = sy-msgv3.
    monitor-msgv4 = sy-msgv4.
    APPEND monitor.
*   if abort is not equal zero, the update process will be canceled
    ABORT = 1.
    EXIT.       " added exit as per your suggestion
  ENDIF.
  ABORT = 0.
*$*$ end of routine - insert your code only before this line         *-*
ENDFORM.
Thanks & Regards
ragu -
Request failing while data loading
Hi Experts,
I am extracting data for the DataSource 0CO_PC_01 up to the PSA, but the load fails every time with the message "Error occurred in data selection".
When I checked the source system, there is a short dump with the error "Source no. does not exist".
This happens when I load for fiscal period > 001.2014; prior to this period the loads are successful. It is a FULL load.
Do you have any insights on this "Source no. does not exist" issue? Any ideas will be useful.
Regards,
Naveen
Hi Naveen,
I had a similar issue at the start of this year. Can you please check whether you have done the initialization for this year for this DataSource?
Go to Scheduler > Initialization options for source system.
Check whether the fiscal period range 001.2014 - 012.2014 is covered.
If this year is not initialized, you will not be able to pull this year's data for this DataSource into BW. In that case you need to re-initialize for fiscal periods 001.2014 - 012.2030, so that you don't need to keep doing this activity every year.
Please check and let me know, so that I can share the next steps.
Thanks
Zeenath -
Increase no. of background processes during data load, and how to bypass a DTP in a process chain
Hello All,
We want to improve the performance of the loads. Currently we load the data from an external database through a DB link. Just to mention, we are on a BI 7 system. We bypass the PSA to load the data fastest; unfortunately we cannot use the PSA, because load times are higher when we use it. So we access views on the external database directly, and the external database is indexed as per our requirements.
Currently our DTP is set to run with 10 parallel processes (in the DTP settings for the batch manager, with job class A). Even though it is set to 10, we see the loads running on only 3 or 4 parallel background processes. Not sure why. Does anyone know why it behaves like that and how to increase them?
I want to split the load into three (different DTPs with different selections), with all three loading data into the same InfoProvider in parallel. We have a routine in the selection that looks up a table to get the respective selection conditions, and all three DTPs kick off in parallel as part of the process chain.
But in some cases we only get data for one or two of the DTPs (depending on the selection conditions). In this case, is there any way in a routine or in the process chain to say that if there is no selection for a DTP, that DTP should be ignored or set to success, so that the process chain continues?
Really appreciate your help.
Hi,
Sounds like a nice problem...
Here is a response to your questions.
Before I start, I just want to mention that I do not understand how you are bypassing the PSA if you are using a DTP. Be that as it may, I will respond regardless.
When looking at performance, you need to identify where your problem is.
First, execute your view directly on the database. Ask the DBA if you do not have access. If possible, perform a database explain on the view (this can also be done from within SAP, I think). This step is required to ensure that the view is not the cause of your performance problem. If it is, you need to take steps to resolve that.
If the view performs well, consider the following SAP BI ETL design changes:
1. Are you loading deltas or full loads? When you have performance problems, the first thing to consider is making use of the delta queue (or changing the extraction to send only deltas to BI).
2. Drop indexes before the load and re-create them after the load.
3. Make use of the BI 7.0 write-optimized DSO. This allows for much faster loads.
4. Check whether you do ABAP lookups during the load. If you do, consider loading the DSO that you are selecting from into memory and changing the lookup to refer to the in-memory table instead. This will save tremendous time in terms of DB I/O.
5. This has cost implications, but the BI Accelerator will allow for much faster loads.
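Point 2 (dropping and re-creating indexes) can be scripted in a small ABAP program run before and after the load step in the process chain. A sketch, assuming the standard BW function modules RSDU_INFOCUBE_INDEXES_DROP and RSDU_INFOCUBE_INDEXES_REPAIR are available in your release (check SE37 first; the cube name is a placeholder):

```abap
* Sketch: drop the cube's secondary indexes before the load and rebuild
* them afterwards. 'ZSALES' is a placeholder InfoCube name.
REPORT z_toggle_cube_indexes.

PARAMETERS: p_cube LIKE rsdcube-infocube DEFAULT 'ZSALES',
            p_drop RADIOBUTTON GROUP g1,   " run before the load
            p_rep  RADIOBUTTON GROUP g1.   " run after the load

START-OF-SELECTION.
  IF p_drop = 'X'.
    CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_DROP'
      EXPORTING
        i_infocube = p_cube.
  ELSE.
    CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_REPAIR'
      EXPORTING
        i_infocube = p_cube.
  ENDIF.
```

Scheduled as two ABAP process steps around the DTP, this avoids maintaining the indexes row by row during the load.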
Good luck! -
Aggregation while data loading
Hi All,
I need some help understanding how to load the following data.
I have the following DB table structure:
Jan ABC Company 1 01/01/2011
Jan ABC Company 1 01/10/2011
Feb ABC Company 1 02/15/2011
Mar ABC Company 1 03/20/2011
When I load this data into Essbase, it aggregates rows 1 and 2, so only 3 records are present in Essbase: one each for Jan, Feb, and Mar.
But since they have different dates, I want them loaded separately.
Please advise how I can achieve this.
Thanks
Essbase databases are typically used for summary-level information, not detail. If you want all of that detail, I can give you two suggestions:
1. Use another tool.
2. Use Essbase Studio to build your cube at a summary level and use drill-through to a relational source to let users get to the detail data and see what makes up the summary.
(Note: I mention Essbase Studio, but you could also use FDM or a third-party tool like Dodeca.)