The Case of the Disappearing Data
I don't know if this problem is unique to me, and it could well mean I have a defective 3G iPhone. One day about two weeks ago I went to listen to music on my iPhone and was greeted by the musical note (meaning no songs available). Looking under Settings > General > About, it showed zero songs, but still showed the same free space as when 840 songs were present. I performed a restore on the phone, but it said I had a corrupt backup. My only backup choice available was the sync from my previous 2G iPhone, which I selected and finished out the restore. All my songs were restored, but since then I have not been able to reconnect with my company's Exchange server using the same settings as before. I was hoping the 2.1 update would solve my problems, but I am still unable to get the Exchange email to work. Today I received a text message from my wife and noticed it was displaying just her mobile number and not her name. I looked in Contacts and every entry displayed the contact name only but none of the other info (address, phone numbers, etc.). I reset the iPhone and my contact info fortunately was restored via MobileMe, but I still think I have issues. Has anyone experienced these kinds of issues with data? In total I have lost the songs on this twice and just experienced the contacts glitch today.
This has been reported before. It's database corruption.
The missing music can usually be fixed without a restore.
Usually, buying one song from the iTunes Store or adding one song through an iTunes sync will fix things up.
Your contacts may have a similar problem, or it could be MobileMe. There have been lots of problems with MobileMe, so much so that I just avoid it altogether.
Similar Messages
-
Has the "reset data and documents" disappeared
Has the "reset data and documents" disappeared from the advanced settings on icloud.com after signing in?
Yes.
-
Survey questionnaire disappearing when updating the start date and end date to previous year
Hi Experts
When the start and end dates are updated to the previous year's dates, the questionnaire disappears from the assignment block. Is this standard behaviour, or is this a bug which needs to be raised with SAP?
Please advise.
A quick response would be appreciated as this is quite urgent.
Regards
AM
Hi AM,
Check the SAP Note 1629777 - SVY: Survey becomes invalid if the from date < sysdate
This should resolve the issue.
Regards,
Gervase -
The cellular data services have disappeared from my settings. Can anyone help me restore it please?
Try this - Reset the iPad by holding down on the Sleep and Home buttons at the same time for about 10-15 seconds until the Apple Logo appears - ignore the red slider - let go of the buttons. (This is equivalent to rebooting your computer.) No data/files will be erased. http://support.apple.com/kb/ht1430
Cheers, Tom -
The Case of the Disappearing Documents
The first time this happened to me, I figured I was just
imagining things. Now that it's happened to me a second time, I
am certain that my imagination has nothing to do with it.
I am working in Captivate 2 in an XP (SP2) environment. I've
completed about a dozen projects successfully using Captivate 1 and
am in the midst of creating my first project in version 2.
I've created a Captivate 2 document with less than a dozen
slides. I experience the disappearing menus problem quite
frequently -- perhaps every half hour or so while using the program
(I haven't tried the clicking on the library item trick yet, but
will the next time it happens). Typically I click on the Save
button and quit out of the program whenever it happens. One time
when I did that, the CP file went missing from the folder it was
saved in. Completely missing.
The file icon itself was not in the Folder. The file name was
no longer listed on the launch window of the program, nor was it
listed in the File menu as a recently opened file. I know I am not
crazy and figured the file had to be somewhere on my hard drive, so
I started searching for files of a certain size which were updated
in the last day. I actually located the file, which had been
renamed with a .tmp extension (and gobbledygook before the
extension). I only recognized the file because of its size and when
I hovered my mouse pointer over it, the document's name from the
Properties dialog box showed up in a tool tip. I was able to open
the file, rename it, and resume using it.
I figured it was just a fluke...but now it's happened a
second time, albeit slightly different circumstances. The program
froze and BAM! the file completely disappeared. Again, I searched
my hard drive and was able to locate the file.
So now, I save a copy of my file about every half-hour in a
separate folder...because I cannot afford to lose more than a
half-hour's work. I am very worried about the stability of this
product. Does anybody have any ideas regarding this issue?
I don't know if version 3 will be any better, but I have my
doubts. I am totally embarrassed about my failure to be able to
complete this project on time for my client due to all of these
issues. As a freelancer, the ability to make my clients' deadlines
and provide a high-quality end product is crucial to my continued
success. I am furious with Adobe for this product's instability.
quote:
Originally posted by:
paddie_ooo
Hi geekygURL:
Several folks on our team have had the same issue. The files
simply will not open... or the file appears but shows as 0 slides
and 0kb in size. We're also working from c:// ... with 400G+ of
space... and using std naming convention for files (no special
characters etc).
Unfortunately, missed deadlines and extra non-billing hours
will result in staffing issues and different purchasing decisions
in the next fiscal year. Some folks are saying... why did Adobe
drop the "R" when they released Captivate 2.
Is Captivate 3 really going to be any better? If so... how
about a free upgrade for all of us who suffered through
Captivate 2? It's really tough to justify the expense of Captivate 3
to our bosses when all they have seen are the problems with
Captivate 2. -
BEX to allow for the dynamic date on a variant as is the case with ABAP
Hi All,
I have a doubt about where we are able to use a dynamic date for a BEx date variable.
In a normal ABAP program, we have the option to get a dynamic date for date field parameters. Similarly, can we have a dynamic date for a BEx date variable?
Please let me know.
Thanks in advance for your help,
Bandana.
Edited by: Bandana Baghel on Jul 3, 2009 10:44 AM
Hi Bandana,
When you say dynamic date, are you referring to the system date on which the BEx query runs?
If so, there are SAP Exit variables available which can be used to retrieve the system date at run time.
Try using the 0DAT variable, which will return the system date at run time.
Regards
Amit Roy -
Why did the cellular data tab disappear
I am not able to access the internet. My cellular button is nowhere to be found.
Hi DELTADAWN545,
Welcome to the Apple Support Communities!
I understand that you are unable to locate the cellular data settings on your iPad. The cellular data settings should be found under Settings > Cellular Data, as described in the attached article. If this option is not there and you are sure this is an iPad Wi-Fi plus Cellular model, please let me know.
Learn about cellular data settings and usage on your iPhone and iPad (Cellular Model) - Apple Support
Best regards,
Joe -
How do I remove the photo dates other than with Spot Removal?
How can I remove the photo dates that appear in yellow on a photo, other than using the Spot Removal tool, which has not worked?
While I agree that more detail is necessary from the original poster ... maybe even a screen capture
I don't know if this is what the original poster means, but when I have tried to use the Lightroom 5 Enhanced Spot Removal tool for this purpose, I could never obtain a result that removed the offending object and filled in the desired area in a relatively continuous and natural-looking way. Whenever I tried, the end result always had unnatural-looking discontinuities caused by my application of the Enhanced Spot Removal tool.
And so, in my case, to eliminate the dates from the bottom of my photos, I wound up using different software ... in my case Photoshop Elements, which produced a more satisfactory result.
On the left is the version of the photo created by Photoshop Elements; the date is missing (and at the bottom right, a lamp that illuminates the sign is also missing). On the right is the image with no attempt to remove the date and lamp. My attempts to reproduce the removal of this date via Lightroom were unsuccessful: the grass never looked natural and always had discontinuities which made the result unsatisfactory, so I did not save the Lightroom attempts to remove the date and lamp. (And if you look carefully on the left, you can see places where the shadows abruptly disappear, an artifact of the process; however, I still deemed the photo on the left a successful removal of these items, as I don't usually focus on the shadows.) -
My Apple TV gets stuck and says it needs to set the date and time on the network. Does anyone know how to do this?
I was able to get my Apple TV working again by downgrading the firmware following the instructions in this message:
https://discussions.apple.com/message/20008613#20008613
In case that message disappears, here's my translation:
1. Connect your Apple TV to your computer via USB. Don't plug in the power cable.
2. Download the older 5.0.2 firmware from Apple here:
5.0.2 (Apple TV 2): AppleTV2,1_5.0.2_9B830_Restore.ipsw
5.0.2 (Apple TV 3): AppleTV3,1_5.0.2_9B830_Restore.ipsw
3. Open iTunes and select the Apple TV. Hold down the option key and press the restore button at the same time.
4. In the dialog that opens select the firmware that you downloaded.
5. The restore process will take a few minutes. Once it's done plug your Apple TV back into your TV. If the date and time message persists just press the menu button to skip it. This got mine working again. -
CBO generating different plans for the same data in similar Environments
Hi All
I have been trying to compare a SQL statement across two different but similar environments built to the same hardware specs. The issue I am facing is that in environment A, the query executes in less than 2 minutes, with the plan mostly showing full table scans and hash joins, whereas in environment B (the problematic one), it times out after 2 hours with an "unable to extend tablespace" error. The statistics are up to date in both environments for both tables and indexes. System parameters are exactly the same (Oracle defaults, except for multiblock_read_count).
Both environments have the same db parameters for db_file_multiblock_read_count (16), optimizer (see below), hash_area_size (131072), pga_aggregate_target (1G), db_block_size (8192), etc. SREADTIM, MREADTIM, CPUSPEED and MBRC are all null in aux_stats in both environments because workload statistics were never collected, I believe.
Attached are details about the SQL with table stats, the SQL, and index stats. My main concern is the CBO generating different plans for similar data and statistics on the same hardware and software specs. Is there anything else I should consider? I generally see environment B being very slow, and its plans always tend toward nested loops and index scans, whereas what we really need is a sensible FTS in many cases. One very surprising thing is that METER_CONFIG_HEADER below, which has just 80 blocks of data, is being driven to an index scan.
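For comparison purposes, the plans further below can be regenerated in either environment with DBMS_XPLAN; a minimal sketch of the standard approach (not necessarily how the attached plans were captured, and the sample statement is only a placeholder):

```sql
-- Estimated plan: replace the statement with the actual problem query.
EXPLAIN PLAN FOR
SELECT COUNT(*) FROM METER_CONFIG_HEADER WHERE ROW_CURRENT_IND = 'Y';

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- Actual runtime plan of the last statement run in this session (10g+);
-- add a /*+ gather_plan_statistics */ hint to the query for per-step row counts.
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY_CURSOR(NULL, NULL, 'ALLSTATS LAST'));
```

Comparing estimated versus actual rows per step between the two environments usually shows which cardinality estimate is pushing environment B toward nested loops.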
show parameter optimizer
optimizer_dynamic_sampling integer 2
optimizer_features_enable string 10.2.0.4
optimizer_index_caching integer 0
optimizer_index_cost_adj integer 100
optimizer_mode string ALL_ROWS
optimizer_secure_view_merging boolean TRUE
**Environment**
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
PL/SQL Release 10.2.0.4.0 - Production
CORE 10.2.0.4.0 Production
TNS for Solaris: Version 10.2.0.4.0 - Production
NLSRTL Version 10.2.0.4.0 - Production
Note: there are slight differences in the number of records in the attached sheet. However, I have tested with the exact same data and was getting similar results; I just couldn't retain that data until collecting the details in the attachment.
TEST 1 COMPARE TABLE LEVEL STATS used by CBO
ENVIRONMENT A
TABLE_NAME NUM_ROWS BLOCKS LAST_ANALYZED
ASSET 3607425 167760 5/02/2013 22:11
METER_CONFIG_HEADER 3658 80 5/01/2013 0:07
METER_CONFIG_ITEM 32310 496 5/01/2013 0:07
NMI 1899024 33557 18/02/2013 10:55
REGISTER 4830153 101504 18/02/2013 9:57
SDP_LOGICAL_ASSET 1607456 19137 18/02/2013 15:48
SDP_LOGICAL_REGISTER 5110781 78691 18/02/2013 9:56
SERVICE_DELIVERY_POINT 1425890 42468 18/02/2013 13:54
ENVIRONMENT B
TABLE_NAME NUM_ROWS BLOCKS LAST_ANALYZED
ASSET 4133939 198570 16/02/2013 10:02
METER_CONFIG_HEADER 3779 80 16/02/2013 10:55
METER_CONFIG_ITEM 33720 510 16/02/2013 10:55
NMI 1969000 33113 16/02/2013 10:58
REGISTER 5837874 120104 16/02/2013 11:05
SDP_LOGICAL_ASSET 1788152 22325 16/02/2013 11:06
SDP_LOGICAL_REGISTER 6101934 91088 16/02/2013 11:07
SERVICE_DELIVERY_POINT 1447589 43804 16/02/2013 11:11
TEST ITEM 2 COMPARE INDEX STATS used by CBO
ENVIRONMENT A
TABLE_NAME INDEX_NAME UNIQUENESS BLEVEL LEAF_BLOCKS DISTINCT_KEYS AVG_LEAF_BLOCKS_PER_KEY AVG_DATA_BLOCKS_PER_KEY CLUSTERING_FACTOR NUM_ROWS
ASSET IDX_AST_DEVICE_CATEGORY_SK NONUNIQUE 2 9878 67 147 12982 869801 3553095
ASSET IDX_A_SAPINTLOGDEV_SK NONUNIQUE 2 7291 2747 2 639 1755977 3597916
ASSET SYS_C00102592 UNIQUE 2 12488 3733831 1 1 3726639 3733831
METER_CONFIG_HEADER SYS_C0092052 UNIQUE 1 12 3670 1 1 3590 3670
METER_CONFIG_ITEM SYS_C0092074 UNIQUE 1 104 32310 1 1 32132 32310
NMI IDX_NMI_ID NONUNIQUE 2 6298 844853 1 2 1964769 1965029
NMI IDX_NMI_ID_NK NONUNIQUE 2 6701 1923072 1 1 1922831 1923084
NMI IDX_NMI_STATS NONUNIQUE 1 106 4 26 52 211 211
REGISTER REG_EFFECTIVE_DTM NONUNIQUE 2 12498 795 15 2899 2304831 4711808
REGISTER SYS_C00102653 UNIQUE 2 16942 5065660 1 1 5056855 5065660
SDP_LOGICAL_ASSET IDX_SLA_SAPINTLOGDEV_SK NONUNIQUE 2 3667 1607968 1 1 1607689 1607982
SDP_LOGICAL_ASSET IDX_SLA_SDP_SK NONUNIQUE 2 3811 668727 1 2 1606204 1607982
SDP_LOGICAL_ASSET SYS_C00102665 UNIQUE 2 5116 1529606 1 1 1528136 1529606
SDP_LOGICAL_REGISTER SYS_C00102677 UNIQUE 2 17370 5193638 1 1 5193623 5193638
SERVICE_DELIVERY_POINT IDX_SDP_NMI_SK NONUNIQUE 2 4406 676523 1 2 1423247 1425890
SERVICE_DELIVERY_POINT IDX_SDP_SAP_INT_NMI_SK NONUNIQUE 2 7374 676523 1 2 1458238 1461108
SERVICE_DELIVERY_POINT SYS_C00102687 UNIQUE 2 4737 1416207 1 1 1415022 1416207
ENVIRONMENT B
TABLE_NAME INDEX_NAME UNIQUENESS BLEVEL LEAF_BLOCKS DISTINCT_KEYS AVG_LEAF_BLOCKS_PER_KEY AVG_DATA_BLOCKS_PER_KEY CLUSTERING_FACTOR NUM_ROWS
ASSET IDX_AST_DEVICE_CATEGORY_SK NONUNIQUE 2 8606 121 71 16428 1987833 4162257
ASSET IDX_A_SAPINTLOGDEV_SK NONUNIQUE 2 8432 1780146 1 1 2048170 4162257
ASSET SYS_C00116157 UNIQUE 2 13597 4162263 1 1 4158759 4162263
METER_CONFIG_HEADER SYS_C00116570 UNIQUE 1 12 3779 1 1 3734 3779
METER_CONFIG_ITEM SYS_C00116592 UNIQUE 1 107 33720 1 1 33459 33720
NMI IDX_NMI_ID NONUNIQUE 2 6319 683370 1 2 1970460 1971313
NMI IDX_NMI_ID_NK NONUNIQUE 2 6597 1971293 1 1 1970771 1971313
NMI IDX_NMI_STATS NONUNIQUE 1 98 48 2 4 196 196
REGISTER REG_EFFECTIVE_DTM NONUNIQUE 2 15615 1273 12 2109 2685924 5886582
REGISTER SYS_C00116748 UNIQUE 2 19533 5886582 1 1 5845565 5886582
SDP_LOGICAL_ASSET IDX_SLA_SAPINTLOGDEV_SK NONUNIQUE 2 4111 1795084 1 1 1758441 1795130
SDP_LOGICAL_ASSET IDX_SLA_SDP_SK NONUNIQUE 2 4003 674249 1 2 1787987 1795130
SDP_LOGICAL_ASSET SYS_C004520 UNIQUE 2 5864 1795130 1 1 1782147 1795130
SDP_LOGICAL_REGISTER SYS_C004539 UNIQUE 2 20413 6152850 1 1 6073059 6152850
SERVICE_DELIVERY_POINT IDX_SDP_NMI_SK NONUNIQUE 2 3227 660649 1 2 1422572 1447803
SERVICE_DELIVERY_POINT IDX_SDP_SAP_INT_NMI_SK NONUNIQUE 2 6399 646257 1 2 1346948 1349993
SERVICE_DELIVERY_POINT SYS_C00128706 UNIQUE 2 4643 1447946 1 1 1442796 1447946
TEST ITEM 3 COMPARE PLANS
ENVIRONMENT A
Plan hash value: 4109575732
| Id | Operation | Name | Rows | Bytes |TempSpc| Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 13 | 2067 | | 135K (2)| 00:27:05 |
| 1 | HASH UNIQUE | | 13 | 2067 | | 135K (2)| 00:27:05 |
|* 2 | HASH JOIN | | 13 | 2067 | | 135K (2)| 00:27:05 |
|* 3 | HASH JOIN | | 6 | 900 | | 135K (2)| 00:27:04 |
|* 4 | HASH JOIN ANTI | | 1 | 137 | | 135K (2)| 00:27:03 |
|* 5 | TABLE ACCESS BY INDEX ROWID| NMI | 1 | 22 | | 5 (0)| 00:00:01 |
| 6 | NESTED LOOPS | | 1 | 131 | | 95137 (2)| 00:19:02 |
|* 7 | HASH JOIN | | 1 | 109 | | 95132 (2)| 00:19:02 |
|* 8 | TABLE ACCESS FULL | ASSET | 36074 | 1021K| | 38553 (2)| 00:07:43 |
|* 9 | HASH JOIN | | 90361 | 7059K| 4040K| 56578 (2)| 00:11:19 |
|* 10 | HASH JOIN | | 52977 | 3414K| 2248K| 50654 (2)| 00:10:08 |
|* 11 | HASH JOIN | | 39674 | 1782K| | 40101 (2)| 00:08:02 |
|* 12 | TABLE ACCESS FULL | REGISTER | 39439 | 1232K| | 22584 (2)| 00:04:32 |
|* 13 | TABLE ACCESS FULL | SDP_LOGICAL_REGISTER | 4206K| 56M| | 17490 (2)| 00:03:30 |
|* 14 | TABLE ACCESS FULL | SERVICE_DELIVERY_POINT | 675K| 12M| | 9412 (2)| 00:01:53 |
|* 15 | TABLE ACCESS FULL | SDP_LOGICAL_ASSET | 1178K| 15M| | 4262 (2)| 00:00:52 |
|* 16 | INDEX RANGE SCAN | IDX_NMI_ID_NK | 2 | | | 2 (0)| 00:00:01 |
| 17 | VIEW | | 39674 | 232K| | 40101 (2)| 00:08:02 |
|* 18 | HASH JOIN | | 39674 | 1046K| | 40101 (2)| 00:08:02 |
|* 19 | TABLE ACCESS FULL | REGISTER | 39439 | 500K| | 22584 (2)| 00:04:32 |
|* 20 | TABLE ACCESS FULL | SDP_LOGICAL_REGISTER | 4206K| 56M| | 17490 (2)| 00:03:30 |
|* 21 | TABLE ACCESS FULL | METER_CONFIG_HEADER | 3658 | 47554 | | 19 (0)| 00:00:01 |
|* 22 | TABLE ACCESS FULL | METER_CONFIG_ITEM | 7590 | 68310 | | 112 (2)| 00:00:02 |
Predicate Information (identified by operation id):
2 - access("METER_CONFIG_HEADER_SK"="METER_CONFIG_HEADER_SK")
3 - access("NETWORK_TARIFF_CD"="NETWORK_TARIFF_CD")
4 - access("SERVICE_DELIVERY_POINT_SK"="TMP"."SERVICE_DELIVERY_POINT_SK")
5 - filter("ROW_CURRENT_IND"='Y' AND ("NMI_STATUS_CD"='A' OR "NMI_STATUS_CD"='D'))
7 - access("ASSET_CD"="EQUIP_CD" AND "SAP_INT_LOG_DEVICE_SK"="SAP_INT_LOG_DEVICE_SK")
8 - filter("ROW_CURRENT_IND"='Y')
9 - access("SERVICE_DELIVERY_POINT_SK"="SERVICE_DELIVERY_POINT_SK")
10 - access("SERVICE_DELIVERY_POINT_SK"="SERVICE_DELIVERY_POINT_SK")
11 - access("SAP_INT_LOGICAL_REGISTER_SK"="SAP_INT_LOGICAL_REGISTER_SK")
12 - filter("REGISTER_TYPE_CD"='C' AND (SUBSTR("REGISTER_ID_CD",1,1)='4' OR
SUBSTR("REGISTER_ID_CD",1,1)='5' OR SUBSTR("REGISTER_ID_CD",1,1)='6') AND "ROW_CURRENT_IND"='Y')
13 - filter("ROW_CURRENT_IND"='Y')
14 - filter("ROW_CURRENT_IND"='Y')
15 - filter("ROW_CURRENT_IND"='Y')
16 - access("NMI_SK"="NMI_SK")
18 - access("SAP_INT_LOGICAL_REGISTER_SK"="SAP_INT_LOGICAL_REGISTER_SK")
19 - filter("REGISTER_TYPE_CD"='C' AND (SUBSTR("REGISTER_ID_CD",1,1)='1' OR
SUBSTR("REGISTER_ID_CD",1,1)='2' OR SUBSTR("REGISTER_ID_CD",1,1)='3') AND "ROW_CURRENT_IND"='Y')
20 - filter("ROW_CURRENT_IND"='Y')
21 - filter("ROW_CURRENT_IND"='Y')
22 - filter("ROW_CURRENT_IND"='Y' AND "CONROL_REGISTER"='X')
ENVIRONMENT B
Plan hash value: 2826260434
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 1 | 181 | 103K (2)| 00:20:47 |
| 1 | HASH UNIQUE | | 1 | 181 | 103K (2)| 00:20:47 |
|* 2 | HASH JOIN ANTI | | 1 | 181 | 103K (2)| 00:20:47 |
|* 3 | HASH JOIN | | 1 | 176 | 56855 (2)| 00:11:23 |
|* 4 | HASH JOIN | | 1 | 163 | 36577 (2)| 00:07:19 |
|* 5 | TABLE ACCESS BY INDEX ROWID | ASSET | 1 | 44 | 4 (0)| 00:00:01 |
| 6 | NESTED LOOPS | | 1 | 131 | 9834 (2)| 00:01:59 |
| 7 | NESTED LOOPS | | 1 | 87 | 9830 (2)| 00:01:58 |
| 8 | NESTED LOOPS | | 1 | 74 | 9825 (2)| 00:01:58 |
|* 9 | HASH JOIN | | 1 | 52 | 9820 (2)| 00:01:58 |
|* 10 | TABLE ACCESS BY INDEX ROWID| METER_CONFIG_HEADER | 1 | 14 | 1 (0)| 00:00:01 |
| 11 | NESTED LOOPS | | 1 | 33 | 116 (2)| 00:00:02 |
|* 12 | TABLE ACCESS FULL | METER_CONFIG_ITEM | 1 | 19 | 115 (2)| 00:00:02 |
|* 13 | INDEX RANGE SCAN | SYS_C00116570 | 1 | | 1 (0)| 00:00:01 |
|* 14 | TABLE ACCESS FULL | SERVICE_DELIVERY_POINT | 723K| 13M| 9699 (2)| 00:01:57 |
|* 15 | TABLE ACCESS BY INDEX ROWID | NMI | 1 | 22 | 5 (0)| 00:00:01 |
|* 16 | INDEX RANGE SCAN | IDX_NMI_ID_NK | 2 | | 2 (0)| 00:00:01 |
|* 17 | TABLE ACCESS BY INDEX ROWID | SDP_LOGICAL_ASSET | 1 | 13 | 5 (0)| 00:00:01 |
|* 18 | INDEX RANGE SCAN | IDX_SLA_SDP_SK | 2 | | 2 (0)| 00:00:01 |
|* 19 | INDEX RANGE SCAN | IDX_A_SAPINTLOGDEV_SK | 2 | | 2 (0)| 00:00:01 |
|* 20 | TABLE ACCESS FULL | REGISTER | 76113 | 2378K| 26743 (2)| 00:05:21 |
|* 21 | TABLE ACCESS FULL | SDP_LOGICAL_REGISTER | 5095K| 63M| 20245 (2)| 00:04:03 |
| 22 | VIEW | | 90889 | 443K| 47021 (2)| 00:09:25 |
|* 23 | HASH JOIN | | 90889 | 2307K| 47021 (2)| 00:09:25 |
|* 24 | TABLE ACCESS FULL | REGISTER | 76113 | 966K| 26743 (2)| 00:05:21 |
|* 25 | TABLE ACCESS FULL | SDP_LOGICAL_REGISTER | 5095K| 63M| 20245 (2)| 00:04:03 |
Predicate Information (identified by operation id):
2 - access("SERVICE_DELIVERY_POINT_SK"="TMP"."SERVICE_DELIVERY_POINT_SK")
3 - access("SERVICE_DELIVERY_POINT_SK"="SERVICE_DELIVERY_POINT_SK" AND
"SAP_INT_LOGICAL_REGISTER_SK"="SAP_INT_LOGICAL_REGISTER_SK")
4 - access("ASSET_CD"="EQUIP_CD")
5 - filter("ROW_CURRENT_IND"='Y')
9 - access("NETWORK_TARIFF_CD"="NETWORK_TARIFF_CD")
10 - filter("ROW_CURRENT_IND"='Y')
12 - filter("ROW_CURRENT_IND"='Y' AND "CONROL_REGISTER"='X')
13 - access("METER_CONFIG_HEADER_SK"="METER_CONFIG_HEADER_SK")
14 - filter("ROW_CURRENT_IND"='Y')
15 - filter("ROW_CURRENT_IND"='Y' AND ("NMI_STATUS_CD"='A' OR "NMI_STATUS_CD"='D'))
16 - access("NMI_SK"="NMI_SK")
17 - filter("ROW_CURRENT_IND"='Y')
18 - access("SERVICE_DELIVERY_POINT_SK"="SERVICE_DELIVERY_POINT_SK")
19 - access("SAP_INT_LOG_DEVICE_SK"="SAP_INT_LOG_DEVICE_SK")
20 - filter((SUBSTR("REGISTER_ID_CD",1,1)='4' OR SUBSTR("REGISTER_ID_CD",1,1)='5' OR
SUBSTR("REGISTER_ID_CD",1,1)='6') AND "REGISTER_TYPE_CD"='C' AND "ROW_CURRENT_IND"='Y')
21 - filter("ROW_CURRENT_IND"='Y')
23 - access("SAP_INT_LOGICAL_REGISTER_SK"="SAP_INT_LOGICAL_REGISTER_SK")
24 - filter((SUBSTR("REGISTER_ID_CD",1,1)='1' OR SUBSTR("REGISTER_ID_CD",1,1)='2' OR
SUBSTR("REGISTER_ID_CD",1,1)='3') AND "REGISTER_TYPE_CD"='C' AND "ROW_CURRENT_IND"='Y')
25 - filter("ROW_CURRENT_IND"='Y')
Edited by: abhilash173 on Feb 24, 2013 9:16 PM
Edited by: abhilash173 on Feb 24, 2013 9:18 PM
Hi Paul,
I misread your question initially. The system stats are outdated in both (same result as seen from aux_stats). I am not a DBA and do not have access to gather system stats afresh.
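For reference, if a DBA were available, workload system statistics could be refreshed with DBMS_STATS; a sketch (the 30-minute window is illustrative, and this requires appropriate privileges):

```sql
-- Gather workload system statistics over a 30-minute window while a
-- representative load is running; populates SREADTIM, MREADTIM, MBRC etc.
EXEC DBMS_STATS.GATHER_SYSTEM_STATS(gathering_mode => 'INTERVAL', interval => 30);

-- Alternatively, bracket the workload manually:
EXEC DBMS_STATS.GATHER_SYSTEM_STATS('START');
-- ... run a representative workload ...
EXEC DBMS_STATS.GATHER_SYSTEM_STATS('STOP');
```

While those values stay null, the optimizer derives its I/O costs from the noworkload figures (IOSEEKTIM, IOTFRSPEED, CPUSPEEDNW) instead of measured timings.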
select * from sys.aux_stats$
SNAME PNAME PVAL1 PVAL2
SYSSTATS_INFO STATUS NULL COMPLETED
SYSSTATS_INFO DSTART NULL 02-16-2011 15:24
SYSSTATS_INFO DSTOP NULL 02-16-2011 15:24
SYSSTATS_INFO FLAGS 1 NULL
SYSSTATS_MAIN CPUSPEEDNW 1321.20523 NULL
SYSSTATS_MAIN IOSEEKTIM 10 NULL
SYSSTATS_MAIN IOTFRSPEED 4096 NULL
SYSSTATS_MAIN SREADTIM NULL NULL
SYSSTATS_MAIN MREADTIM NULL NULL
SYSSTATS_MAIN CPUSPEED NULL NULL
SYSSTATS_MAIN MBRC NULL NULL
SYSSTATS_MAIN MAXTHR NULL NULL
SYSSTATS_MAIN SLAVETHR NULL NULL -
Enhancing the standard Data source without deleting setup tables
Hi all,
I am on a support project. My requirement is to enhance the "2LIS_13_VDITM" LO DataSource with two fields without disturbing the delta.
Please suggest me how I have to do this.
As per my knowledge:
1. We have to delete the setup tables.
2. Enhance the data source and repopulate the setup tables.
3. Delete the data in the cube, add the two new fields to the cube, and repopulate the cube with a new init.
4. After that, the delta will be enabled through job control.
But this process is not suitable for our requirement, because the delta was enabled long ago and has been running smoothly to date; I don't want to disturb that process.
So please suggest me is there any other procedure to do this.
Thanks,
Kiran Manyam
Hi,
If historical data (loaded earlier into BW) is not required for the two enhanced fields, then it is not necessary to delete the setup tables and reload them to BI.
In this case simply you can follow the following procedure.
1. Enhance the fields and update the transfer structure (to unhide these fields). In BI, update the required data target with the respective InfoObjects, and populate the enhanced fields in the exit. No need to disturb the delta.
2. Replicate the DataSource in BI and do the mappings in the transformation.
Here the existing delta keeps working, and you populate the two fields in the exit only.
Thanks,
Jugal. -
Hello there,
I have an Excel report I created which works perfectly fine on my dev environment, but fails on my test environment when I try to do a data refresh.
The key difference between both dev and test environments is that in dev, everything is installed in one server:
SharePoint 2013
SQL 2012: Database Instance, SSAS Instance, SSRS for SharePoint, SSAS POWERPIVOT instance (Powerpivot for SharePoint).
In my test and production environments, the architecture is different:
SQL DB Servers in High Availability (irrelevant for this report since it is connecting to the tabular model, just FYI)
SQL SSAS Tabular server (contains a tabular model that processes data from the SQL DBs).
2x SharePoint Application Servers (we installed both SSRS and PowerPivot for SharePoint on these servers)
2x SharePoint FrontEnd Servers (contain the SSRS and PowerPivot add-ins).
Now in dev, test and production, I can run PowerPivot reports that have been created in SharePoint without any issues. Those reports can access the SSAS Tabular model without any issues, and perform data refresh and OLAP functions (slicing, dicing, etc).
The problem is with Excel reports (i.e. .xlsx files) uploaded to SharePoint. While I can open them, I am having a hard time performing a data refresh. The error I get is:
"An error occurred during an attempt to establish a connection to the external data source [...]"
I ran SQL Profiler on my SSAS server where the Tabular instance is, and I noticed that every time I try to perform a data refresh, two entries appear under the user name ANONYMOUS LOGON.
Since things work without any issues on my single-server dev environment, I tried running SQL Server Profiler there as well to see what I get.
As you can see from the above, in the dev environment the query runs without any issues and the user name logged is in fact my username from the dev environment domain. I also have a separated user for the test domain, and another for the production domain.
Now upon some preliminary investigation I believe this has something to do with the data connection settings in Excel and the usage (or no usage) of secure store. This is what I can vouch for so far:
Library containing reports is configured as trusted in SharePoint Central Admin.
Library containing data connections is configured as trusted in SharePoint Central Admin.
The Data Provider referenced in the Excel report (MSOLAP.5) is configured as trusted in SharePoint Central Admin.
In the Excel report, the Excel Services authentication setting is set to "use authenticated user's account". This works fine in the DEV environment.
Concerning Secure Store, the PowerPivot Configurator has configured the PowerPivotUnattendedAccount application ID in all the environments. There is NO configuration of an application ID for Excel Services in any of the environments (dev, test or production). Although I reckon this is where the solution lies, I am not 100% sure why it fails in test and prod. But as I write this, I reckon it is because of the authentication "hops" through servers. Am I right in my assumption?
Could someone please advise what I am doing wrong in this case? If I am in fact missing a Secure Store entry for Excel Services, could someone advise me on how to set it up? My confusion is around the "Target Application Type" setting.
Thank you for your time.
Regards,
P.
Hi Rameshwar,
PowerPivot workbooks contain embedded data connections. To support workbook interaction through slicers and filters, Excel Services must be configured to allow external data access through embedded connection information. External data access is required
for retrieving PowerPivot data that is loaded on PowerPivot servers in the farm. Please refer to the steps below to solve this issue:
In Central Administration, in Application Management, click Manage service applications.
Click Excel Services Application.
Click Trusted File Location.
Click http:// or the location you want to configure.
In External Data, in Allow External Data, click Trusted data connection libraries and embedded.
Click OK.
For more information, please see:
Create a trusted location for PowerPivot sites in Central Administration:
http://msdn.microsoft.com/en-us/library/ee637428.aspx
Another reason is Excel Services returns this error when you query PowerPivot data in an Excel workbook that is published to SharePoint, and the SharePoint environment does not have a PowerPivot for SharePoint server, or the SQL Server Analysis
Services (PowerPivot) service is stopped. Please check this document:
http://technet.microsoft.com/en-us/library/ff487858(v=sql.110).aspx
Finally, here is a good article regarding how to troubleshoot PowerPivot data refresh for your reference. Please see:
Troubleshooting PowerPivot Data Refresh:
http://social.technet.microsoft.com/wiki/contents/articles/3870.troubleshooting-powerpivot-data-refresh.aspx
Hope this helps.
Elvis Long
TechNet Community Support -
How can I reference records outside the two date parameters?
Hi all,
I have a query that fetches records based on the two date parameters defined (Startdate and Enddate).
If the Startdate is 2014-12-01 and the Enddate is 2014-12-12, I want to pull records outside these two date parameters, that is, from 2014-09-01 to 2014-11-30.
I want to add up the records between 2014-09-01 and 2014-11-30 and include them in one of the columns in my report.
I tried using this query:
SUM(CASE WHEN FilteredIncident.Statuscodename IN ('QUEUED', 'ASSIGNED') AND (EnteredOn >= '2014-09-01' AND EnteredOn<= @StartDate) THEN 1 ELSE 0 END) AS OpenRecords
Please help with any ideas, thanks.
Please follow basic netiquette and post the DDL we need to answer this. Follow industry and ANSI/ISO standards in your data. You should follow ISO-11179 rules for naming data elements. You should follow ISO-8601 rules for displaying temporal data. We need to know the data types, keys and constraints on the table. Avoid dialect in favor of ANSI/ISO Standard SQL. And you probably need to read and download the PDF for:
https://www.simple-talk.com/books/sql-books/119-sql-code-smells/
There is no such crap as a “status_code_name” in RDBMS. It has to be a “<something in particular>_status”; think about how silly that data element name is! Want to keep going and have a “status_code_name_value_id”? LOL!
The name “Filtered_Incident” is also wrong. Tables are sets, so unless you have only one element in this set, the table name should be a plural or (better) collective name. But a better question is why did you split out “Filtered_Incidents” from “Incidents”?
Would you also split "Male_Personnel" and "Female_Personnel" from "Personnel"?
Get a book on data modeling and learn some basics.
>> I have a query that fetches records [sic: rows are not records] based on the two date parameters defined (report_start_date and report_end_date). If the report_start_date is 2014-12-01 and the report_end_date is 2014-12-12, I want to pull records [sic]
outside these two date parameters, that is 2014-09-01 and 2014-11-30. I want to add up the records [sic] from 2014-09-01 and 2014-11-30 and include them in one of the columns in my report. <<
Having no DDL and no sample data makes this hard. Does your boss make you program without any documentation, DDL, etc? This spec is vague; you say to do a total, but show a count, etc.
One of the many nice things about DATE data types is that the BETWEEN predicate works with them, so you can quit writing 1960s BASIC predicates with primitive logic operators.
Here is a guess:
SELECT SUM(CASE WHEN incident_date BETWEEN '2014-09-01' AND @report_start_date
           THEN 1 ELSE 0 END) AS open_record_cnt
  FROM Incidents
 WHERE incident_status IN ('QUEUED', 'ASSIGNED')
   AND incident_date <= @report_end_date;
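The SUM(CASE ... THEN 1 ELSE 0 END) in that guess is a conditional count: the WHERE clause narrows the rows, and the CASE expression counts only the ones inside the earlier date window. As a quick sanity check of the pattern (a sketch only, not the poster's actual schema — the table, columns and sample dates are invented for illustration), here it is run against an in-memory SQLite database from Python:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE Incidents (
        incident_id     INTEGER PRIMARY KEY,
        incident_status TEXT NOT NULL,
        incident_date   TEXT NOT NULL  -- ISO-8601 strings compare correctly
    )""")
conn.executemany(
    "INSERT INTO Incidents (incident_status, incident_date) VALUES (?, ?)",
    [("QUEUED",   "2014-09-15"),   # counted: open status, inside the window
     ("ASSIGNED", "2014-11-30"),   # counted
     ("QUEUED",   "2014-12-05"),   # not counted: after report_start_date
     ("CLOSED",   "2014-10-01")])  # not counted: not an open status

report_start_date, report_end_date = "2014-12-01", "2014-12-12"
(open_record_cnt,) = conn.execute("""
    SELECT SUM(CASE WHEN incident_date BETWEEN '2014-09-01' AND ?
               THEN 1 ELSE 0 END)
      FROM Incidents
     WHERE incident_status IN ('QUEUED', 'ASSIGNED')
       AND incident_date <= ?""",
    (report_start_date, report_end_date)).fetchone()
print(open_record_cnt)  # 2
```

Only the two open incidents dated before the report window are counted; the open incident inside the window and the closed one are excluded.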
--CELKO-- Books in Celko Series for Morgan-Kaufmann Publishing: Analytics and OLAP in SQL / Data and Databases: Concepts in Practice / Data, Measurements and Standards in SQL / SQL for Smarties / SQL Programming Style / SQL Puzzles and Answers / Thinking in Sets / Trees and Hierarchies in SQL -
Hi
I'm having some iTunes Match issues.
I ran iTunes Match. I've since found, I think:
a) if incorrect genre etc. data has been uploaded to iCloud, I cannot find how to delete it and re-upload the correct song and data. I am presently assuming that once songs are matched, the uploaded metadata (genre, etc.) is treated as 'the master' by iTunes.
Question: if I want to change the genre on a track, what's the correct way to overwrite the data held in iCloud? If I delete the track, I just bring down the old incorrect genres again!
b) I've deleted all my music tracks from iCloud (select all, delete, Y to delete tracks from iCloud). I have turned off iTunes Match on my iPad, iPhone, Apple TV, Mac and Windows PC. I've deleted the iTunes library files from the Mac and PC too, in desperation. But hours/days later, when I turn on iTunes Match on the Mac or PC, it tells me I still have 9k tracks in iCloud — all with the old, wrong metadata, ready to mess up my [now-corrected] files back on my PC!
I cannot delete them no matter what I do. The number of tracks in the cloud was 12k, now 9k, so it is reducing by about 1,500 tracks a day.
Does it really take iTunes servers that long to delete the tracks I've asked it to delete, or am I seeing garbage data?
Is there a quicker way of deleting all my iTunes Match library, songs and data?

Thanks for the helpful replies.
My original intention was to 'upgrade' some of my tracks. In those cases, I'd run iTunes Match, and the process was completed.
I later downloaded those new tracks but found during that process that the tracks I was downloading had the old metadata.
I guess the idea there is to update the local genres, 'update iTunes Match' and wait. How long, I don't know, but that's the current solution, yes?
Thing is, I was trying to get my local library updated and, as I thought it, back in order, but I was finding my local genre updates were being overwritten. I had a lot of tracks that kept on stating they could be downloaded, so I thought the new files hadn't been copied down...
I think I was trying to get match to do too many things at once.
Deleting my tracks in the cloud? That was an act of desperation: start over with all known-good genres locally and re-upload from scratch. I'd read in some forum or other to do that; I guess that was wrong.
Ok, so I will.. err:
A) Wait and see how long it takes to delete my tracks in the cloud, i.e. let it complete the process I'd already started.
B) Correct my local tracks' genres.
C) Wait till those changes propagate up to iCloud (or whatever the right term is).
Q1: How long should I wait for A and C to complete?
Q2: Is it 100% definite that local genre changes will get copied up to the cloud when I 'update iTunes Match'?
I don't see any animated cloud icon, but I will watch for it!
Thanks again
Iain -
Same set of records not in the same data package of the extractor
Hi All,
I have a scenario. While extracting records from ECC based on some condition, I want to add some more records during extraction. To be more clear: based on some condition, I want to add additional lines of data by issuing APPEND C_T_DATA.
For example:
I have a set of records with the same company code, same contract, same delivery leg and different pricing legs.
If the delivery leg and pricing leg are both 1, then I want to add one line of record.
There will be several records with the same company code, contract, delivery leg and pricing leg. In the extraction logic I will extract with the command i_t_data[] = c_t_data[], then sort by company code, contract, delivery and pricing leg, then use the DELETE ADJACENT DUPLICATES command to get one record. Based on this record, with some condition, I will populate the new line of record my business needs.
My concern is:
if the same set of records overshoots the data package size, how do I handle this? Is there any option?
My data package size is 50,000. Suppose I get a same set of records (i.e. same company code, contract, delivery leg and pricing leg) starting at the 49,999th record. If there are 10 records with the same characteristics, the extraction will happen across 2 data packages, so the delete-duplicates step and the above logic will go wrong. How can I handle this scenario? Would a delta-enabled function module help me tackle it? I want to do this only in extraction, as a data source enhancement.
Anil.
Edited by: Anil on Aug 29, 2010 5:56 AM

Hi,
You will have to do the enhancement of the data source.
Please follow the below link.
You can write your logic to add the additional records in the case statement for your data source.
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c035c402-3d1a-2d10-4380-af8f26b5026f?quicklink=index&overridelayout=true
Hope this will solve your issue.
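Anil's concern is real: deduplicating each data package independently miscounts any key group that straddles a package boundary. Here is a small sketch of the failure mode (Python rather than ABAP, with invented key values, purely to illustrate the logic — not SAP code). The usual remedy in the extractor is to carry the last, possibly incomplete, key group over into the next package before deduplicating:

```python
# 7 rows, 2 key groups: (company_code, key). Group "K02" will straddle
# the boundary between package 1 (rows 0-4) and package 2 (rows 5-6).
records = [("C100", "K01")] * 3 + [("C100", "K02")] * 4
package_size = 5

def dedupe(package):
    """Mimic SORT + DELETE ADJACENT DUPLICATES on one data package."""
    seen, out = set(), []
    for row in sorted(package):
        if row not in seen:
            seen.add(row)
            out.append(row)
    return out

# Per-package dedup: group K02 survives once in EACH package,
# so we get 3 "unique" rows instead of the true 2.
per_package = []
for i in range(0, len(records), package_size):
    per_package.extend(dedupe(records[i:i + package_size]))
print(len(per_package))        # 3 -- wrong

# Deduplicating across the full set (e.g. by buffering the trailing
# key group until the next package arrives) gives the correct count.
print(len(dedupe(records)))    # 2 -- right
```

Any follow-on logic keyed off "one record per group" (like Anil's APPEND of a derived line) would fire twice for the straddling group under per-package dedup, which is exactly the bug he describes at the 49,999th record.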