Query on Delta
Hi,
Scenario
I am extracting delta loads from R/3 to BW. Suddenly I lost my delta records: the delta posting has already happened, but the records did not arrive in BW and I also cannot see them in the delta queue.
How do I resolve this situation?
Thanks
Hi
Run a full repair request, which will fetch the missing records.
First check which company codes' records you have missed; run a full repair for those company codes.
After that you can run deltas again as usual.
Note: don't forget to set the "Repair Full Request" indicator:
1) Go to the InfoPackage and copy it.
2) Select the full load.
3) Go to Scheduler in the menu, click "Repair Full Request" and set the indicator.
4) Schedule the load.
regards
AK
Hope it helps
Similar Messages
-
Use Trigger or Query for Deltas
Hi Gurus,
It has come up that my boss wants us to use a trigger on one of the most critical tables we use for our website. Instead, I have suggested querying for deltas and feeding them out at time intervals. Which method has the least performance impact on our DB?
Any info/suggestions would be much appreciated.
Hi,
ScarpacciOne wrote:
When is it a good idea to use triggers? What scenarios are a good fit for implementing a trigger?
DML triggers are great for automatically supplying or correcting data entry.
For example, supplying an ID number from a sequence when a new row is inserted, or filling in last_modify_date and last_modify_user columns.
Sometimes you can automatically correct data entry errors. For example, if a string is supposed to be all upper-case, or a DATE column is supposed to be midnight. You can (and should) have CHECK constraints to make sure people don't enter bad data, but sometimes you can safely say "I'm sure he meant 'ACTIVE', even though he said 'active'", or "I'm sure she meant TRUNC(SYSDATE), not SYSDATE", and avoid raising any error.
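To make the performance question concrete, here is a toy Python simulation (not Oracle code; the class and method names are made up) of the two capture styles being compared: a trigger pays its cost synchronously on every DML statement, while polling for deltas by a last-modified timestamp pays once per interval and sees only each row's net change.

```python
from datetime import datetime, timedelta

class Table:
    def __init__(self):
        self.rows = {}          # row_id -> (value, last_modified)
        self.trigger_log = []   # changes captured trigger-style, at write time

    def upsert(self, row_id, value, now):
        self.rows[row_id] = (value, now)
        self.trigger_log.append((row_id, value))   # cost paid on every write

    def poll_deltas(self, since):
        # one scan per polling interval, regardless of how many writes occurred
        return [(rid, val) for rid, (val, ts) in self.rows.items() if ts > since]

t = Table()
start = datetime(2024, 1, 1, 12, 0, 0)
t.upsert(1, "a", start + timedelta(seconds=1))
t.upsert(1, "b", start + timedelta(seconds=2))   # same row changed twice
t.upsert(2, "x", start + timedelta(seconds=3))

print(len(t.trigger_log))         # 3 -> the trigger saw every write
print(len(t.poll_deltas(start)))  # 2 -> polling sees only the final state per row
```

The simulation just illustrates the trade-off the answer below describes: synchronous capture versus deferred, batched capture.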
DML triggers are the only way to ensure something is done as soon as some DML is performed. If it can wait, even a few minutes, then some other mechanism (like a scheduled job, or a materialized view) can probably handle the situation more easily and reliably. -
BI Query Statistic - Delta Update for 0TCT_C01.
Hi gurus,
I have installed certain standard content objects (InfoProviders and their respective queries). One such InfoProvider is 0TCT_C01. I do not have a process chain for this InfoProvider, so I load data into this cube manually every morning via a delta.
The number of records that come in with the first delta run of the day is around 400-500. Yesterday I ran this delta in the afternoon as well, and I got 4 records in the update.
Now, this cube holds details such as how many times a particular query has been run, how many queries have been run for the day, who the users are, etc.
My question is: when I ran the delta the first time in the morning I got 450-odd records, so how come the count is so low when I ran it in the afternoon?
One thing I would mention is that there are a lot of users who run many queries very frequently, so it would not be correct to say that only a few queries have been run since the morning.
Could anyone shed some light on this?
Rgds,
Sree.
Edited by: Sree Nair on Oct 15, 2009 7:51 AM
No update, so closing the thread. -
Hi BW Experts,
For AP (Accounts Payable) and AR (Accounts Receivable) we can run a delta process to pick up delta records. How?
Could anyone please let me know?
Thanks
FI extractors work with an after-image delta: delta records are selected directly from the R/3 tables using a time stamp mechanism and transferred directly to BW, without being written to the BW delta queue. Please read up on the FI DataSources 0FI_GL_4, 0FI_AP_4 and 0FI_AR_4.
0FI_GL_4 (G/L Accounts: Line Items): no redundant fields are transferred into BW, only fields from the FI document tables (BKPF/BSEG) that are relevant to General Ledger Accounting (compare table BSIS); no customer- or vendor-related fields.
0FI_AP_4 (AP: Line Items) and 0FI_AR_4 (AR: Line Items): vendor/customer-related information (e.g. payment/dunning data).
Coupled, consistent snapshot of FI data in BW: the G/L account extraction determines the selection criteria (company code, fiscal period) and the upper time limit of all extracted FI line items; for AP and AR extraction no further selection criteria are necessary or possible. Uncoupled extraction is possible with Plug-In PI 2002.2, see OSS note 551044.
0FI_GL_4, 0FI_AP_4 and 0FI_AR_4 use an after-image delta: FI line items are transferred from the source system in their final state (= after image). This delta method is not suitable for direct InfoCube update; an ODS object is obligatory to determine the delta for the InfoCube update. The delta queue and BW scheduler ensure correct serialization of the records (e.g. inserts must not overtake changes) and distribution of delta records to multiple BW systems. The selection criteria of the delta-init upload are used to couple the DataSources logically.
Time stamp mechanism: new FI documents posted in R/3 since the last line-item extraction are selected based on the field BKPF-CPUDT.
Hierarchy Extractor for Balance Sheet & P&L Structure
Technical name: 0GLACCEXT_T011_HIER
Technical data: type of DataSource: hierarchies. Application component: FI-IO.
Use: the extractor is used for loading hierarchies (balance sheet / P&L structures) for the characteristic (InfoObject) 0GLACCEXT.
Fields of origin in the extract structure (field | description | table of origin | field in table of origin):
.INCLUDE | ROSHIENODE | RFDT | CLUSTD
FIELDNM | RSFIELDNM | RFDT | CLUSTD
GLACCEXT | GLACCEXT | RFDT | CLUSTD
RSIGN | RR_RSIGN | RFDT | CLUSTD
PLUMI | RR_PLUMI | RFDT | CLUSTD
Features of the extractor: extractor FBIW_HIERARCHY_TRANSFER_TO, extraction structure DTFIGL_HIERNODE_1.
Financial Accounting: Line Item Extraction Procedure
General information about the line item extraction procedure: BW Release 3.1 makes consistent data extraction in the delta method possible for line items in General Ledger Accounting (FI-GL), selected subsidiary ledgers (Accounts Receivable FI-AR and Accounts Payable FI-AP) and tax reporting. The extraction procedure delivered with BW Release 2.0B, based on DataSources 0FI_AR_3 and 0FI_AP_3, can be replaced; this is described in note 0410797. The decisive advantage of choosing the R/3 line item tables as the data source is that extra fields can be transferred to BW that were not available with transaction figures from table GLT0, the previous R/3 data source in General Ledger Accounting (FI-GL). This allows more extensive and flexible analysis in BW.
To enable you to assure consistent delta data, four new InfoSources are provided in the OLTP system (with the corresponding DataSources and extractors in the SAP R/3 system):
FI-GL | 0FI_GL_4 | General Ledger: Line Items
FI-AP | 0FI_AP_4 | Accounts Payable: Line Items (extraction linked to 0FI_GL_4)
FI-AR | 0FI_AR_4 | Accounts Receivable: Line Items (extraction linked to 0FI_GL_4)
FI-TX | 0FI_TAX_4 | General Ledger: Data for Tax on Sales/Purchases
For the General Ledger, selection is made from tables BKPF and BSEG, while selection for the subsidiary accounts is made from tables BSID/BSAD (Accounts Receivable) and BSIK/BSAK (Accounts Payable). InfoSource 0FI_GL_4 transfers only those fields of the Financial Accounting document (tables BKPF and BSEG) that are relevant for General Ledger Accounting to the BW system. The consistent recording of data from General Ledger Accounting and subledger accounting is provided by means of coupled delta extraction in the time stamp procedure. General Ledger Accounting is the main process in delta mode and provides the subsidiary ledger extraction with time stamp information (time intervals of previously selected general ledger line items). This time stamp information can also be used for a loading history: it shows which line items have previously been extracted from the SAP R/3 system.
Delta Method
Delta extraction enables you to load into the BW system only the data that has been added or changed since the last extraction event. Data that is already loaded and has not changed is retained and does not need to be deleted before a new upload. This procedure improves performance compared with periodic extraction of the overall dataset. Financial Accounting line items are read by the extractors directly from the tables in the SAP R/3 system. A time stamp on the line items serves to identify the status of the delta data.
Time stamp intervals that have already been read are stored in a time stamp table. The delta dataset is transferred to the BW system directly, without records being written to the delta queue in the SAP R/3 system (extractor delta method). The Financial Accounting line items are extracted from the SAP R/3 system in their most recent status (after-image delta method). This delta method is not suitable for filling InfoCubes directly in the BW system. The line items must therefore first be loaded into an ODS object in the BW system, which identifies the changes made to individual characteristics and key figures within a delta data record. Other data targets (InfoCubes) can then be supplied with data from this ODS object.
Time Stamp Method
For Financial Accounting line items posted in the SAP R/3 system since the last data request, the extractors identify the delta dataset using the time stamp in the document header (BKPF-CPUDT). When a delta dataset has been selected successfully, the SAP R/3 system logs two time stamps that delimit a selection interval for a DataSource in table BWOM2_TIMEST (field name | key | description):
MANDT | X | Client
OLTPSOURCE | X | DataSource
AEDAT | X | SYSTEM: Date
AETIM | X | SYSTEM: Time
UPDMODE | | Data update mode (full, delta, delta init)
TS_LOW | | Lower limit of the time selection (time stamp in seconds since 1.1.1990)
TS_HIGH | | Upper limit of the time selection (time stamp in seconds since 1.1.1990)
LAST_TS | | Flag: 'X' = last time stamp interval of the delta extraction
TZONE | | Time zone
DAYST | | Daylight saving time active?
The time stamps are determined from the system date and time and converted to the format "seconds since 1.1.1990", taking into account the time zone and daylight saving time. To ensure correct and unique reconversion to date and time, the time zone and daylight saving time must be stored in table BWOM2_TIMEST. Table BWOM2_TIMEST therefore serves to document the loading history of Financial Accounting line items.
It also provides defined restart points following incorrect data requests. To provide a better overview, the time stamps in the example table are shown in date format; the columns TZONE and DAYST are left out.
OLTPSOURCE | AEDAT/AETIM | UPD | DATE_LOW | DATE_HIGH | LAST_TS
0FI_GL_4 | 16 May 2000/20:15 | Init | 01 Jan 1990 | 15 May 2000 |
0FI_GL_4 | 24 May 2000/16:59 | Delta | 16 May 2000 | 23 May 2000 |
0FI_GL_4 | 02 June 2000/21:45 | Delta | 24 May 2000 | 01 June 2000 |
0FI_GL_4 | 15 June 2000/12:34 | Delta | 02 June 2000 | 14 June 2000 |
0FI_GL_4 | 21 June 2000/18:12 | Delta | 15 June 2000 | 20 June 2000 | X
0FI_AP_4 | 18 May 2000/21:23 | Init | 01 Jan 1990 | 15 May 2000 |
0FI_AP_4 | 30 May 2000/12:48 | Delta | 16 May 2000 | 23 May 2000 |
0FI_AP_4 | 10 June 2000/13:19 | Delta | 24 May 2000 | 01 June 2000 | X
0FI_AR_4 | 17 May 2000/18:45 | Init | 01 Jan 1990 | 15 May 2000 |
0FI_AR_4 | 04 June 2000/13:32 | Delta | 16 May 2000 | 01 June 2000 |
0FI_AR_4 | 16 June 2000/15:41 | Delta | 02 June 2000 | 14 June 2000 | X
0FI_TX_4 | 17 May 2000/18:45 | Init | 01 Jan 1990 | 15 May 2000 |
0FI_TX_4 | 04 June 2000/13:32 | Delta | 16 May 2000 | 01 June 2000 |
0FI_TX_4 | 16 June 2000/15:41 | Delta | 02 June 2000 | 14 June 2000 | X
Constraints
Per day, no more than one delta dataset can be transferred for InfoSource 0FI_GL_4. The extracted data therefore has the status of the previous day; for further data requests on the same day, the InfoSource does not provide any data. In delta mode, data requests with InfoSource 0FI_AR_4 and InfoSource 0FI_AP_4 do not provide any data if no new extraction has taken place with InfoSource 0FI_GL_4 since the last data transfer. This ensures that the data in the BW system for Accounts Receivable and Accounts Payable Accounting is exactly as up to date as the data for General Ledger Accounting. If you delete the initialization selection for InfoSource 0FI_GL_4 in the source system from the BW Administrator Workbench, the time stamp entries for InfoSources 0FI_GL_4, 0FI_AP_4, 0FI_AR_4 and 0FI_TX_4 are also removed from table BWOM2_TIMEST.
Recording Changed Line Items
In the case of Financial Accounting line items that have been changed since the last data request in the SAP R/3 system, there is no reliable time stamp that documents the time of the last change. For this reason, all line items that are changed in a way relevant to BW must be logged in the SAP R/3 system.
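The time stamp procedure described above can be sketched as a toy Python model. This is illustrative only, not SAP code: each run selects documents whose posting date (BKPF-CPUDT in the real system) falls into a new interval, and the interval bounds are logged the way BWOM2_TIMEST logs TS_LOW/TS_HIGH.

```python
from datetime import datetime, timedelta

documents = [
    {"doc": "100", "cpudt": datetime(2000, 5, 10)},
    {"doc": "101", "cpudt": datetime(2000, 5, 20)},
    {"doc": "102", "cpudt": datetime(2000, 5, 23)},
]

timestamp_log = []   # stands in for table BWOM2_TIMEST

def run_extraction(docs, ts_high):
    if timestamp_log:    # delta run: continue after the last logged interval
        ts_low = timestamp_log[-1]["ts_high"] + timedelta(days=1)
    else:                # init run: everything since 1.1.1990
        ts_low = datetime(1990, 1, 1)
    picked = [d["doc"] for d in docs if ts_low <= d["cpudt"] <= ts_high]
    timestamp_log.append({"ts_low": ts_low, "ts_high": ts_high})
    return picked

init_batch  = run_extraction(documents, datetime(2000, 5, 15))   # ['100']
delta_batch = run_extraction(documents, datetime(2000, 5, 23))   # ['101', '102']
print(init_batch, delta_batch)
```

Because each interval starts where the previous one ended, every document is selected exactly once, and the log doubles as the loading history.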
-
I have a problem after creating/assigning a generic delta for transaction data in transaction RSO2 and generating the delta update.
How do I see the data being updated (the new rows or columns of the delta update), and in which transaction?
Any Pointer are appreciated.
- Kshitij
Hi,
Have you done an initialization for the DataSource? Only after a successful initialization will the entry show up in RSA7.
If you see the entry but the "Total" field shows zero LUWs, that means there is no new data. You can cross-check this with RSA3.
Cheers,
Swapna.G
Message was edited by:
swapna gollakota -
Hi people, I have a problem.
In my company there is a document that was revoked in R/3. Now when I subsequently load the delta, the document is still active in BW. What can I do? Where is the error?
Thanks
Hi,
When you say revocation, is it the status "reason for rejection" being set for that sales order?
If you want that to be reflected in BW, then you should check the status of that field. This will definitely come through to BW as a delta.
If it is about cancellation, then you will get a record with a cancellation indicator in the BW delta. If you are loading it into an ODS with the record mode mapped to ROCANCEL, cancelled records will be deleted automatically in BW and you will not be able to see them. You can see the records with the flag "reason for rejection".
Hope it helps
Thanks -
Additive delta causing problem with data being displayed in the query.
Hi,
I have a delta mechanism to load data from an ODS to a cube. The delta runs every night, and when we run the report for yesterday, the report works fine. But if some of the line items changed during the course of today, the delta picks them up in tonight's job. When I then run the report tomorrow for yesterday and today, it displays incorrect data, because the cube thinks there is more than one line item.
For example, yesterday:
Salesdocument customfield docitems keyfield
ODS: 1001 1 1 1
IC: 1001 1 1 1
When I run the query today with "customfield" in the report, it displays a value of "1", which is correct.
Today, let's say the key field has been changed to 2. Since the ODS is overwritable:
ODS: 1001 1 1 2
but in the IC I see
IC: 1001 1 1 1
1001 1 1 -1
1001 1 1 2
and when I report on "customfield" it displays the value "3" instead of "1". Is there any way to solve this issue?
This is very urgent.
Thanks,
PV
PV,
Normally when the delta is loaded from an ODS to a cube, it will have two records: one is the reverse image of the existing record and the other is the new image of the record. So, in your case (I think you are referring to a change in the key figure, not the key field):
After the change, this is how it should look like --
ODS : 1001 1 1 2
IC: 1001 1 1 1 (original record)
1001 -1 -1 -1 (reverse image of the old record)
1001 1 1 2 (new image)
So when you run the report, you will get the following correct result.
1001 1 1 2
If this is not happening, then something else must be going on. You can check all this by loading the delta to the PSA and looking for the particular record.
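The two-record pattern described above can be sketched as a toy Python model. The field names (keyfig, recordmode) are illustrative stand-ins, not the actual BW structures: when the ODS overwrites a key figure, its change log emits a before image with negated values plus an after image, so an additive cube load nets out to the correct current value.

```python
def change_log_records(old_row, new_row):
    # before image: old values negated, flagged with a reversal record mode
    before = {**old_row, "keyfig": -old_row["keyfig"], "recordmode": "X"}
    after  = {**new_row, "recordmode": ""}
    return [before, after]

cube = []                                       # additive target
cube.append({"salesdoc": "1001", "keyfig": 1})  # original delta load

old = {"salesdoc": "1001", "keyfig": 1}
new = {"salesdoc": "1001", "keyfig": 2}         # ODS overwrote 1 -> 2
for rec in change_log_records(old, new):
    cube.append({"salesdoc": rec["salesdoc"], "keyfig": rec["keyfig"]})

total = sum(r["keyfig"] for r in cube if r["salesdoc"] == "1001")
print(total)   # 1 - 1 + 2 = 2, the correct current value
```

If only the new image arrived (as in the symptom described in the question), the sum would be 1 + 2 = 3 instead of 2.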
-Saket -
Delta buffer query in RSRT for BI-IP (<infoprovider>/!!1<infoprovider>)
Hello People,
I have an input ready query over an aggregation level of a real time cube. Whenever the yellow request is closed and a new request is opened, the input ready query does not show the old data. And sometimes it shows incorrect data. We found that the issue is with the Cache.
In RSRT, when opening the input ready query in debug mode with the "Do not use cache" setting, the query returns correct data. But the surprising thing is that the input ready query already has its cache setting as inactive (0) in RSRT. So we had to generate the delta buffer query <infoprovider>/!!1<infoprovider> in RSRT, where <infoprovider> is the name of the real time cube.
This solved our problem and the query brought in correct data. But again when I close the second request, the input ready query again shows me no data or shows me wrong data. So again we need to generate the delta buffer query in RSRT <infoprovider>/!!1<infoprovider>.
This is very annoying considering that you have to generate the delta buffer query every time a request is closed. It would be an overhead in maintenance and will not go down well with people.
Does anybody have a solution for this issue? Is there any setting by which we can turn off the cache altogether, or delete the cache when a request is closed? Or, worst case, how can we automate the generation of the delta buffer queries every time a request is closed?
Any help is really appreciated.
Regards,
Dove.
Hello Gregor,
Thank you very much.
I was preoccupied with the prerequisites of Note 1179076. It says that if you have already imported Support Packages 15 and 16 and have not imported note 1101187, then no further action is necessary.
We are on support package 17 and have not implemented 1101187, so I thought this note would not help us.
Am I wrong? Can you please throw some light??
Regards,
Arunan.C. -
SQL help - A query to find a delta
Hi All,
My case is:
create table table_test (sample_date date
,status varchar2(1)
,sum number);
insert into table_test values (sysdate-1,'A',1);
insert into table_test values (sysdate-1,'B',2);
insert into table_test values (sysdate-1,'C',5);
insert into table_test values (sysdate-1,'D',10);
insert into table_test values (sysdate,'A',7);
insert into table_test values (sysdate,'B',2);
insert into table_test values (sysdate,'C',3);
I need a query that return the delta between today and yesterday:
status sum
A 6
C -2
D -10
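Before turning to SQL, the requested delta can be cross-checked with a short Python sketch outside the database; the dictionaries mirror the sample rows above, statuses missing from "today" count as 0, and unchanged rows drop out.

```python
yesterday = {"A": 1, "B": 2, "C": 5, "D": 10}
today     = {"A": 7, "B": 2, "C": 3}

delta = {}
for status in sorted(set(yesterday) | set(today)):
    diff = today.get(status, 0) - yesterday.get(status, 0)
    if diff != 0:                 # keep only real changes
        delta[status] = diff

print(delta)   # {'A': 6, 'C': -2, 'D': -10}
```

Note that B (unchanged) is filtered out here, matching the expected result in the question; the SQL answer below keeps it with a delta of 0.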
Thanks!
WITH t_today AS
(SELECT status, NVL (SUM, 0) SUM
FROM table_test
WHERE TRUNC (sample_date) = TRUNC (SYSDATE)),
t_yesterday AS
(SELECT status, NVL (SUM, 0) SUM
FROM table_test
WHERE TRUNC (sample_date) = TRUNC (SYSDATE) - 1)
SELECT t_yesterday.status, NVL (t_today.SUM, 0) - t_yesterday.SUM
FROM t_today RIGHT JOIN t_yesterday ON t_today.status = t_yesterday.status
S NVL(T_TODAY.SUM,0)-T_YESTERDAY.SUM
A 6
B 0
C -2
D -10
Edited: to correct the query -
*Query needs the delta changed records from the change log table*
Hello Friends,
I have a requirement where they want to see the current data and also the changes that happened last week. For example:
Sales Order 1 100 Qty Nov 1 2009
Sales Order 1 50 Qty Oct 25 2009
As you can see, they want to see both of these records in the query. I am running out of ideas on how to realize this scenario. Can we somehow report directly on the change log table, where all the delta changes are maintained?
Appreciate your answers. Thanks for your time.
Hi,
you can do the following:
To find last week:
Create a customer exit variable for 0CALDAY.
In EXIT_SAPLRRS0_001 (include ZXRSRU01), write the code for the variable created above:
DATA: date TYPE sy-datum.
date = sy-datum. "today
date+6(2) = '01'. "first day of this month
date = date - 7. "a week before the first day of this month = last week of last month
For the current week, take the system date (it gives you the current week).
OR (more options)
Some function modules you can use in exits:
DATE_GET_WEEK - returns the week that a given day falls in (current week).
DATE_GET_WEEK called with the date minus 7 days returns the previous week.
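The date arithmetic in the ABAP snippet above can be sketched (and checked) in Python; the function name is made up for illustration. Setting the day to 01 and subtracting 7 days lands in the last week of the previous month.

```python
from datetime import date, timedelta

def last_week_of_previous_month(today):
    first_of_month = today.replace(day=1)        # date+6(2) = '01' in the ABAP
    return first_of_month - timedelta(days=7)    # date = date - 7

print(last_week_of_previous_month(date(2009, 11, 5)))   # 2009-10-25
```

The result is always 7 days before the first of the current month, regardless of the current month's length.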
Santosh -
Query Runtime Statistics - Delta Load Errors
Good Morning,
We recently upgraded our BI system to 7.0. One of the consultants who was here activated some standard business content, as the title of my subject shows, Query Runtime Statistics. The Infocube that is being loaded in the process chain he activated is: 0TCT_C01. This load utilizes 4 individual delta loads, coming from Datasources: 0TCT_DSA1, 0TCT_DSO1, 0TCT_DSO2, and 0TCT_DSO3.
The problem we're having is that at least 2 of these frequently fail ( 0TCT_DSO2 and 0TCT_DSO3 ).
The failure message we get is that some of the data coming in for object 0TCTSTATOBJ (Query Runtime Object) is always bad. Example values of the data it is trying to load include:
FILTER(1)
TABELLE(1)
GENERISCHER NAVIGATIONSBLOCK(1
I'm not seeing anything in particular that would indicate these as invalid values. I wouldn't think the parentheses would be causing it, given that they are acceptable characters in our system (checked RSKC to verify), and nothing else looks out of place (aside from the values coming in German). Has anyone else come across an error like this for this particular statistics load? Manually correcting these erroneous values in the PSA is not an option given the frequency of the failures.
Any help would be greatly appreciated. If I need to clarify or relay any additional information, please let me know as well. Thanks
Sita,
I greatly appreciate your response. However, I probably should have provided some additional technical information regarding the issue.
The values I posted actually are from the PSA. These values tell you which Java object of the report being used has had an issue (it may not just be reporting issues but general usage statistics for these Java objects; not sure yet).
I am aware that when BI gets an invalid character, it replaces it (in the display screen) with a #. Unfortunately, we do have # in our permitted characters, and the reason that doesn't help to qualify what was entered is that while it shows up in the display screen as a #, in the background it is still seen as the invalid character. Consequently, you won't always be able to see what that invalid character is. We've encountered this many times over the years with our sales orders when a rep (or someone within the company) enters invalid characters (typically from a copy and paste from Excel using a different font, or using Arabic or other foreign characters).
Given that the values we're receiving are in German, my only conclusion is that the words themselves contain German characters. For example, the letter U can sometimes carry what they call an umlaut, which looks almost like a quotation mark placed directly above the letter. Another common letter in German is the ß symbol.
We may be able to bypass our problem by entering these German characters, but I was looking for more than just a "hack" solution and was curious whether anyone else out there has had a similar issue on their system and was able to come up with a solution other than guessing and adding "special characters" that are not normally used in the English language.
Again, my apologies for not being more descriptive, and thank you again for replying. -
Query related to Delta changes in ABAP Dictionary
Dear All,
Can any of you please point me to the necessary documents, or suggest the delta changes from 4.6C to ECC 6.0 for the ABAP Data Dictionary?
Thanks and regards,
Atanu
You should be able to find the release documentation for ECC 6.0 on the help.sap.com website. I will post a link if I come across it.
I doubt there would be too many (if any) changes to the data dictionary though.
Hope this helps.
Sudha -
Can you run deltas off of an Infoset query datasource?
Hey Gurus of BW,
We had a consultant come in and set up an InfoSet DataSource in R/3, then set up that DataSource in BW as a feed into a cube using delta pulls. The delta init pull brings all the data, but subsequent deltas never pull any data. Is this even possible?
Thanks, Ken
Ken,
I think this is a generic DataSource created on an InfoSet.
Provide a safety interval lower limit of 1 for the delta field (change date or creation date) and check.
Loading generic DataSource data directly into a cube (additive) may create duplicate data, because with the safety interval each run pulls data from the previous day up to the current moment.
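The duplicate-data warning above can be sketched as a toy Python model; all documents and dates are illustrative. With a safety interval (lower limit = 1 day) on a change-date delta field, each run re-selects the last day covered by the previous run. An additive target (a cube) double-counts the overlap, while an overwriting target (an ODS) does not.

```python
from datetime import date

rows = [("doc1", date(2024, 3, 1)),
        ("doc2", date(2024, 3, 2)),
        ("doc3", date(2024, 3, 3))]

def run(rows, low, high):
    # select documents whose change date falls inside [low, high]
    return [doc for doc, changed_on in rows if low <= changed_on <= high]

run1 = run(rows, date(2024, 2, 28), date(2024, 3, 2))   # doc1, doc2
# safety interval: the next run starts again at the previous upper limit
run2 = run(rows, date(2024, 3, 2), date(2024, 3, 3))    # doc2 again, doc3

cube = run1 + run2              # additive load: doc2 is duplicated
print(sorted(cube))             # ['doc1', 'doc2', 'doc2', 'doc3']
print(sorted(set(cube)))        # overwrite semantics: the duplicate disappears
```

This is why a generic delta with a safety interval is normally staged through an overwriting ODS/DSO before it reaches an additive cube.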
Srini -
SQL Query to find the Delta between 2 rows
Can the below be done? If so, can you help me write a SQL query to do it?
I have data in a spreadsheet which is pulled from the database (data from more than one table, with joins); I send it to the different teams. They check the data and update the spreadsheet if necessary, then send it back to me.
I have to find the changes and update the database from the provided spreadsheet accordingly. Changes can be in different columns in each set of rows.
Example:
DataFrom | ServerName | Branch_Name | Application_Name | Server Status | Application_Status | App_Environment | Tier
SQL Query | abcdef | app | adp | Deployed | Deployed | Production | silver
Excel | abcdef | app | adp | Deployed | Deployed | Development | Bronze

DataFrom | ServerName | Branch_Name | Application_Name | Server Status | Application_Status | App_Environment | Tier
SQL Query | Hijkl | app | adp | Deployed | Deployed | Production | Gold
Excel | Hijkl | app | Dep | Deployed | Deployed | Production | Gold

DataFrom | ServerName | Branch_Name | Application_Name | Server Status | Application_Status | App_Environment | Tier
SQL Query | Xzy | app | Dep | Deployed | Deployed | Production | Silver
Excel | Xzy | App | Dep | Deployed | Deployed | Development | Silver
The scenario above is an example of what I am looking to do with a SQL script. Opinions/queries accepted.
There are 1200+ rows, so comparing them manually is a pain.
Thanks.
Columns are different when they contain multiple distinct values:
SELECT COUNT(DISTINCT Name) ,
COUNT(DISTINCT GroupName) ,
COUNT(*)
FROM HumanResources.Department;
Without a concise and complete example (table DDL and sample data insert statements), it's hard to tell what the correct solution could be..
DECLARE @Sample TABLE ( SetID INT, ServerID INT, ApplicationID INT );
INSERT INTO @Sample
VALUES ( 1, 1, 1 ),
( 1, 1, 1 ),
( 2, 1, 1 ),
( 2, 1, 2 ),
( 3, 1, 1 ),
( 3, 2, 1 );
WITH Evaluate AS
(
SELECT SetID,
COUNT(DISTINCT ServerID) AS Servers,
COUNT(DISTINCT ApplicationID) AS Applications
FROM @Sample
GROUP BY SetID
)
SELECT S.*,
CASE WHEN E.Servers != 1 THEN 1 ELSE 0 END AS ServersDifferent,
CASE WHEN E.Applications != 1 THEN 1 ELSE 0 END AS ApplicationsDifferent
FROM @Sample S
INNER JOIN Evaluate E ON S.SetID = E.SetID
ORDER BY S.SetID; -
I need to create an extractor X_ATTR to bring master data attributes from a table Y. How can I set up the generic delta on this X_ATTR generic DataSource? Does a generic delta exist for attribute DataSources?
Hi,
For the delta to work properly with a generic extractor, the field on which you want to base the delta must be present in the DataSource.
We have three delta types:
1. Time stamp
2. Calendar day
3. Numeric pointer
Depending upon your requirement and the volume of your data, you can choose the delta method.
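The third option, a numeric pointer, can be sketched as follows (table and field names are made up for illustration): the extractor remembers the highest key delivered so far, and each run selects only rows above it.

```python
table_y = [(1, "attrA"), (2, "attrB"), (3, "attrC")]   # key, attribute value
last_pointer = 0

def generic_delta(rows, pointer):
    # select only rows whose key exceeds the stored pointer
    new_rows = [r for r in rows if r[0] > pointer]
    # advance the pointer to the highest key just delivered
    new_pointer = max((r[0] for r in new_rows), default=pointer)
    return new_rows, new_pointer

batch1, last_pointer = generic_delta(table_y, last_pointer)   # init: all 3 rows
table_y.append((4, "attrD"))                                  # new master data arrives
batch2, last_pointer = generic_delta(table_y, last_pointer)   # delta: only the new row

print(len(batch1), len(batch2))   # 3 1
```

A numeric pointer only detects new rows, not changes to existing ones; that is why a time stamp or calendar day delta is preferred when records can be updated in place.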
Thanks,
Saveen Kumar