Full load with deletion selections in Data Mart

Hi,
I have a data mart load from InfoCube "A" to InfoCube "B". The full loads must run daily. InfoCube "A" holds snapshots, identified by the characteristic ZVALID (format YYYY.MM, e.g. 2006.06).
InfoCube "B" does not hold snapshots; it only needs the latest snapshot from InfoCube "A".
I am using the deletion option "delete requests from InfoCube after update" in the InfoPackage, with "InfoSources are the same", "DataSources are the same" and "Source systems are the same" checked, and with "Selections are" set to "Same or more comprehensive". I wrote an ABAP routine in the data selection for the characteristic ZVALID that derives the year.month needed for the extraction (a sketch of such a routine follows), and I have another selection on the characteristic 0PROJECT. But when a new month begins, the old request is not deleted; the new request is simply added, even when I set "Selections are" to "Overlapping".
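For reference, here is a minimal sketch of what such a data-selection routine on ZVALID could look like. The form frame and the structure RSSDLRANGE follow the template the 3.x scheduler generates for selection routines (verify against the frame generated in your system); the characteristic name ZVALID and the YYYY.MM format come from the scenario above, and deriving the period from the system date is an assumption, so replace it with whatever logic determines your latest snapshot:

form compute_zvalid
  tables   l_t_range structure rssdlrange
  changing p_subrc   like sy-subrc.

  data: l_idx       like sy-tabix,
        l_period(7) type c.                    " YYYY.MM, e.g. 2006.06

  " Assumption: the latest snapshot is the current calendar month.
  concatenate sy-datum(4) '.' sy-datum+4(2) into l_period.

  " Restrict the selection to that single period. The generated frame
  " proposes the correct fieldname for your selection field; 'ZVALID'
  " is used here only for readability.
  read table l_t_range with key fieldname = 'ZVALID'.
  l_idx = sy-tabix.
  l_t_range-sign   = 'I'.
  l_t_range-option = 'EQ'.
  l_t_range-low    = l_period.
  clear l_t_range-high.
  modify l_t_range index l_idx.

  p_subrc = 0.
endform.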
Example of what I need:

On 31.01.2006:
In InfoCube A:
0PROJECT | ZVALID  | Amount
PROJ1    | 12.2005 | 300
PROJ1    | 01.2006 | 100
In InfoCube B:
0PROJECT | ZVALID  | Amount
PROJ1    | 01.2006 | 100

On 01.02.2006:
In InfoCube A:
0PROJECT | ZVALID  | Amount
PROJ1    | 12.2005 | 300
PROJ1    | 01.2006 | 100
PROJ1    | 02.2006 | 200
In InfoCube B:
0PROJECT | ZVALID  | Amount
PROJ1    | 02.2006 | 200
What is the correct combination of settings in "delete requests from InfoCube after update"?
I would appreciate your help.
Regards,
Victoria Leó

Do you upload the cube from other sources too? Otherwise there is a flag in the InfoPackage that is called 'Delete entire content of data target'.
It's under Data targets and you can change it if you select 'Select Data Targets'.
Best regards
   Dirk

Similar Messages

  • Delta update data error. Can we do a full load with bad records?

    Hello,
    We are working with SAP BW 7.0, and we had a problem with a delta update. The delta update itself ran correctly, but one record was loaded incorrectly, even though this record is correct in the source system.
    We have already reloaded that record and the master data is now correct. But now we must update the InfoCube. The data originally came in with a delta load, and to load only this one record we would have to run a full load (with only this record) and then selectively delete the wrong record from the InfoCube.
    The problem is that we are unsure how this would affect the delta: the loads are scheduled daily, and we are afraid the delta mechanism could lose track of where the next load should start, giving us problems with the delta loads over the following days.
    Thank you.

    Hi,
    What is your delta extractor (LIS or not LIS)? And what is your target (DSO or cube)?
    The procedure depends on your source and target, but you can handle it in every case:
    In case of a non-LIS extractor:
    just reload with a full InfoPackage to the PSA.
    In case of LIS:
    delete the setup tables,
    then refill them for your record (if you can, with a selection condition).
    In case of a cube in BW:
    do a simple full upload.
    In case of a DSO:
    do a full upload in repair mode (if the dataflow is 3.x), otherwise just use a DTP.
    But if your target is a DSO and that DSO loads other InfoCubes afterwards, make sure that they are not corrupted.
    Cyril

  • Full Load" and "Full load with Repair full request"

    Hello Experts,
    Can anybody tell me the difference between a "Full Load" and a "Full Load with Repair Full Request"?
    Regards.

    Hi,
    What is the function of a full repair? What does it do?
    How do I delete the init from the scheduler? I don't see any such option in the InfoPackage.
    For both of your questions there is OSS Note 739863 (Repairing data in BW). Read the following:
    Symptom
    Some data is incorrect or missing in the PSA table or in the ODS object (Enterprise Data Warehouse layer).
    Other terms
    Restore data, repair data
    Reason and Prerequisites
    There may be a number of reasons for this problem: errors in the relevant application, errors in the user exit, errors in the DeltaQueue, handling errors in the customer's posting procedure (for example, a change to the extract structure during production operation while the DeltaQueue was not yet empty, postings before the Delta Init was completed, and so on), extractor errors, unplanned system terminations in BW and in R/3, and so on.
    Solution
    Read this note in full BEFORE you start actions that may repair your data in BW. Contact SAP Support for help with troubleshooting before you start to repair data.
    BW offers you the option of a full upload in the form of a repair request (as of BW 3.0B). If you want to use this function, we recommend that you use the ODS object layer.
    Note that you should only use this procedure if you have a small number of incorrect or missing records. Otherwise, we always recommend a reinitialization (possibly after a previous selective deletion, followed by a restriction of the Delta-Init selection to exclude areas that were not changed in the meantime).
    1. Repair request: Definition
    If you flag a request as a repair request with full update as the update mode, it can be updated to all data targets, even if these already contain data from delta initialization runs for this DataSource/source system combination. This means that a repair request can be updated into all ODS objects at any time without a check being performed. The system supports loading by repair request into an ODS object without a check being performed for overlapping data or for the sequence of the requests. This action may therefore result in duplicate data and must thus be prepared very carefully.
    The repair request (of the "Full Upload" type) can be loaded into the same ODS object in which the 'normal' delta requests run. You will find this request under the "Repair Request" option in the InfoPackage (Maintenance) menu.
    2. Prerequisites for using the "Repair Request" function
    2.1. Troubleshooting
    Before you start the repair action, you should carry out a thorough analysis of the possible cause of the error to make sure that the error cannot recur when you execute the repair action. For example, if a key figure has already been updated incorrectly in the OLTP system, it will not change after a reload into BW. Use transaction RSA3 (Extractor Checker) in the source system for help with troubleshooting. Another possible source of the problem may be your user exit. To ensure that the user exit is correct, first load the data with a Probe-Full request into the PSA table and check whether the data is correct. If it is not correct, search for the error in the user exit. If you do not find it, we recommend that you deactivate the user exit for testing purposes and request a new full upload. If the data then arrives correctly, it is highly probable that the error is indeed in the user exit.
    We always recommend that you load the data into the PSA table in the first step and check the result there.
    2.2. Analyze the effects on the downstream targets
    Before you start the Repair request into the ODS object, make sure that the incorrect data records are selectively deleted from the ODS object. However, before you decide on selective deletion, you should read the Info Help for the "Selective Deletion" function, which you can access by pressing the extra button on the relevant dialog box. The activation queue and the ChangeLog remain unchanged during the selective deletion of the data from the ODS object, which means that the incorrect data is still in the change log afterwards. After the selective deletion, you therefore must not reconstruct the ODS object if it is reconstructed from the ChangeLog. (Reconstruction is usually from the PSA table but, if the data source is the ODS object itself, the ODS object is reconstructed from its ChangeLog). You MUST read the recommendations and warnings about this (press the "Info" button).
    You MUST also take into account the fact that the delta for the downstream data targets is created from the changelog. If you perform selective deletion and then reload data into the deleted area, this may result in data inconsistencies in the downstream data targets.
    If you only use MOVE and do not use ADD for updates in the ODS object, selective deletion may not be required in some cases (for example, if incorrect records only have to be changed, rather than deleted). In this case, the DataMart delta also remains intact.
    2.3. Analysis of the selections
    You must be very precise when you perform selective deletion: Some applications do not provide the option of selecting individual documents for the load process. Therefore, you must first ensure that you can load the same range of documents into BW as you would delete from the ODS object. This note provides some application-specific recommendations to help you "repair" the incorrect data records.
    If you updated the data from the ODS object into the InfoCube, you can also delete it there using the "Selective deletion" function. However, if it is compressed at document level there and deletion is no longer possible, you must delete the InfoCube content and fill the data in the ODS object again after repair.
    You can only perform this action after a thorough analysis of all effects of selective data deletion. We naturally recommend that you test this first in the test system.
    The procedure generally applies for all SAP applications/extractors. The application determines the selections. For example, if you cannot use the document number for selection but you can select documents for an entire period, then you are forced to delete and then update documents for the entire period in the data target. Therefore, it is important to look carefully at the selections in the InfoPackage before you delete data from the data target.
    Some applications have additional special features:
    Logistics cockpit: As preparation for the repair request, delete the SetUp table (if you have not already done so) and fill it selectively with concrete document numbers (or other possible groups of documents determined by the selection). Execute the Repair request.
    Caution: You can currently use the transactions that fill SetUp tables with reconstruction data to select individual documents or entire ranges of documents (at present, it is not possible to select several individual documents if they are not numbered in sequence).
    FI: The Repair request for the Full Upload is not required here. The following efficient alternatives are provided: In the FI area, you can select documents that must be reloaded into BW again, make a small change to them (for example, insert a period into the assignment text) and save them -> as a result, the document is placed in the delta queue again and the previously loaded document under the same number in the BW ODS object is overwritten. FI also has an option for sending the documents selectively from the OLTP system to the BW system using correction programs (see note 616331).
    3. Repair request execution
    How do you proceed if you want to load a repair request into the data target? Go to the maintenance screen of the InfoPackage (Scheduler), set the type of data upload to "Full", and select the "Scheduler" option in the menu -> Full Request Repair -> Flag request as repair request -> Confirm. Update the data into the PSA and then check that it is correct. If the data is correct, continue to update into the data targets.
    You can also search the forum; you will find discussions on this, for example:
    Full repair loads
    Regarding Repair Full Request
    Instead of doing all these steps, can't I just reload the failed request again?
    If something goes wrong with delta loads, it is usually better to re-init: delete the init flag, run a full repair, and so on. If the target is an InfoCube, you can also go for a full update instead of a full repair.
    Full upload:
    In a full upload all the data records are fetched, so it is similar to a full repair. For an InfoCube you can run a full upload to recover missed delta records. An ODS, however, does not support full upload and delta upload in parallel, so in that case you have to use a full repair, otherwise the delta mechanism will get corrupted.
    If your ODS activation is failing because there is a full upload request in the target, you can convert the full upload to a full repair using the program RSSM_SET_REPAIR_FULL_FLAG.
    Hope this helps.
    Regards,
    Debjani

  • Cube Compression - How it Affects Loading With Delete Overlapping Request

    Hi guys,
    Good day to all !!!
    Our scenario is that we have a process chain that loads data to an InfoCube and includes a delete overlapping requests step. I just want to ask how cube compression affects the loading with the delete overlapping request step. Is there any conflict/error that will be raised? Kindly advise.
    Marshanlou

    Hi,
    In the scenario you have mentioned:
    First the InfoCube is loaded.
    Next, when the chain reaches the delete overlapping request step, it checks whether a previous request overlaps the new one (same date, or whatever overlapping conditions are defined in the InfoPackage).
    Only if a request is overlapping is it deleted; otherwise no action is taken. In this way the step makes sure data is not loaded twice, which would result in duplication.
    Compression itself does not interfere with the load; just note that a request can only be deleted as long as it has not yet been compressed, so the delete overlapping step has to run before the requests it is supposed to remove are compressed.
    Sasi

  • Full load with breaking down the loads into particular ranges - any program?

    Hello all,
    We are trying to do the first full load to our BW production system. We will do a full load and then an init without data transfer to get the delta going.
    The full load is so big that we have to break it down into particular ranges. We have done this once previously by creating 30 InfoPackages and typing the consecutive ranges into the InfoPackages' selection criteria.
    I am just wondering: is there an easier way to do this, with some code or a program, rather than having to create so many InfoPackages?
    Please help with any suggestions/advice.
    Thanks in advance,

    Hello Sasi,
    I am planning to use a process chain, but since the load is so big I am putting a range in the data selection of each InfoPackage (e.g. STNUMBER = 1000000000 to 1000005000), and the InfoPackages get their ranges in increasing steps of 5000 (e.g. the second InfoPackage would be STNUMBER = 1000005001 to 1000010000).
    So I was wondering whether there is an easier way to do this instead of creating 30 such InfoPackages manually and putting them in a process chain (one possible approach is sketched below).
    Any more suggestions?
    Thanks
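    One possible way to avoid maintaining 30 InfoPackages by hand, sketched below under assumptions: keep a single InfoPackage with an ABAP routine in the data selection for STNUMBER, plus a small custom one-row bookkeeping table (here called ZSTNUM_SLICE with a field SLICE, a hypothetical table you would create yourself and initialize once with SLICE = 0) that remembers which block was requested last. The form frame and the structure RSSDLRANGE follow the template the 3.x scheduler generates for selection routines; verify them against the frame generated in your system. The InfoPackage is then simply scheduled repeatedly (for example as repeated steps in the process chain), and each run picks the next block:

      form compute_stnumber
        tables   l_t_range structure rssdlrange
        changing p_subrc   like sy-subrc.

        constants: c_base  type i value 1000000000,  " first document number (illustrative)
                   c_width type i value 5000.        " size of one block (illustrative)

        data: l_idx      like sy-tabix,
              l_slice    type i,                     " 0 = first block, 1 = second, ...
              l_from     type i,
              l_to       type i,
              l_low(10)  type n,
              l_high(10) type n.

        " Which block to load next; the hypothetical bookkeeping row is
        " assumed to exist already (created once with SLICE = 0).
        select single slice from zstnum_slice into l_slice.

        l_from = c_base + l_slice * c_width.
        l_to   = l_from + c_width - 1.
        l_low  = l_from.                             " convert to zero-padded character value
        l_high = l_to.

        " Replace the STNUMBER range with the current block.
        read table l_t_range with key fieldname = 'STNUMBER'.
        l_idx = sy-tabix.
        l_t_range-sign   = 'I'.
        l_t_range-option = 'BT'.
        l_t_range-low    = l_low.
        l_t_range-high   = l_high.
        modify l_t_range index l_idx.

        " Remember that this block has now been requested.
        l_slice = l_slice + 1.
        update zstnum_slice set slice = l_slice.

        p_subrc = 0.
      endform.

    The block boundaries above are only illustrative (the ranges in the post are not exactly 5000 wide), so adjust c_base and c_width to your own numbering.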

  • Problem with deleting old master Data from 0Material

    Hello, we try to delete the master data from 0Material.
    But we get the message that some data are used and can not be deleted.
    We can see the error message in slg1:
    "Master data 000000000100000007 (SID                                                   7) is used in /BI0/HMATERIAL; is not deleted"
    "Master data R23188 (SID                                                5316) is used in /BI0/D0RT_C071; is not deleted"
    Can we just clear these tables? Is it enough for resolving the problem?
    Thank You.

    Hi,
    No, you can't just "clear" these tables unless you want to wreck your data warehouse.
    The first message means that the material with SID 7 is used in a hierarchy of 0MATERIAL ==> adjust the hierarchy first.
    The second message indicates that the material with SID 5316 is used in the InfoCube 0RT_C07, and you CANNOT delete master data if it is used in other objects.
    First run RSRV for this dimension to see whether the DIMID is really used by the fact table: perform the test "Entries Not Used in the Dimension of an InfoCube" with your cube and dimension.
    If the DIMID is not used, "correct" the error; you will then be able to delete your SID.
    If the DIMID is used, you will have to delete your cube data if you really want to delete this material; otherwise you will have to keep it.
    Finally, rerun your master data deletion.
    Hope this helps...
    Olivier.

  • DAC Commands for Incremental and Full load with @DAC_* parameters

    I have my DAC server on Linux running 10.1.3.4.1 with informatica 9.0.1 hotfix 2 also running on the same server. I'm doing a full ETL load for HR 11.10.5 and I get two failed tasks pertaining to parameter files:
    @DAC_SDE_ORA_PersistedStage_WorkforceEvent_Performance_Mntn_CMD
    @DAC_SDE_ORA_PayrollFact_Agg_Items_CMD
    I can see these parameter files are in the SrcFiles directory with the proper names.
    I did some googling and came across a couple of threads with the same problem; the solution was to download a patch (8760212), which was marked obsolete and replaced by 10052370, which Oracle Support says is ALSO obsolete. I'm at a loss here. Where can I find this patch, or a patch to replace it, so I can fix the problem? Any ideas? Thanks.

    Hi,
    You can download the patch from https://support.oracle.com/CSP/ui/flash.html; you need login credentials to download it.
    Thanks,
    Navin Kumar Bolla

  • Table loading with empty selected row

    Hi, I have the table below. When the page loads, it has an empty row created; I want it to show the emptyText message "No data to display" when the page loads. The table was created based on a bean, and I am on JDeveloper 11.1.1.6.0. I don't want to use #{!adfFacesContext.initialRender} because it does not allow me to add values to the table.
    <af:table
                              var="row" rows="#{bindings.addmemberBean.rangeSize}"
                              emptyText="#{bindings.addmemberBean.viewable ? 'No data to display.' : 'Access Denied.'}"
                              fetchSize="#{bindings.addmemberBean.rangeSize}"
                              rowBandingInterval="0" id="t1"
                              binding="#{pageFlowScope.MemberBean.tempTable}"
                              value="#{bindings.addmemberBean.collectionModel}"
                              rowSelection="multiple"
                              selectionListener="#{bindings.addmemberBean.collectionModel.makeCurrent}">
                      <af:column sortProperty="name" sortable="false"
                                 headerText="#{bindings.addmemberBean.hints.name.label}"
                                 id="c6" width="106">
                        <af:outputText id="ot2" value=" #{row.name}"/>
                      </af:column>
                      <af:column sortProperty="surname" sortable="false"
                                 headerText="#{bindings.addmemberBean.hints.surname.label}"
                                 id="c4" width="104">
                        <af:outputText value="#{row.surname}" id="ot3"/>
                      </af:column>
                      <af:column sortProperty="emailaddress" sortable="false"
                                 headerText="#{bindings.addmemberBean.hints.emailaddress.label}"
                                 id="c5" width="105">
                        <af:outputText value="#{row.emailaddress}" id="ot4"/>
                      </af:column>
                      <af:column sortProperty="firstname" sortable="false"
                                 headerText="#{bindings.addmemberBean.hints.firstname.label}"
                                 id="c3" width="105">
                        <af:outputText value="#{row.firstname}" id="ot1"/>
                      </af:column>
                      <af:column id="c7" headerText="AddUser">
                        <af:selectBooleanCheckbox
                                                  label="Label 1" id="sbc1"/>
                      </af:column>
                      <af:column headerText="#{bindings.addmemberBean.hints.name.label}"
                                 id="c8" visible="false">
                        <af:inputText value="#{row.bindings.name.inputValue}"
                                      label="#{bindings.addmemberBean.hints.name.label}"
                                      required="#{bindings.addmemberBean.hints.name.mandatory}"
                                      columns="#{bindings.addmemberBean.hints.name.displayWidth}"
                                      maximumLength="#{bindings.addmemberBean.hints.name.precision}"
                                      shortDesc="#{bindings.addmemberBean.hints.name.tooltip}"
                                      id="it2">
                          <f:validator binding="#{row.bindings.name.validator}"/>
                        </af:inputText>
                      </af:column>
                </af:table>
    And I don't want to use
    public void emptytable(){
            ViewObjectImpl vo = this.getAddMemberVo1();
            vo.executeEmptyRowSet();
        }
    because I am not using a view; I am using a bean.

    Initialize the backing list in the bean's constructor, for example:
    public Beantest() {
        super();
        // start with an empty list so the table renders no rows and shows emptyText
        addmemberBean = new ArrayList<AddmemberBean>();
    }
    This way, the list is completely empty when your bean is created, and thus your table shouldn't contain any data.

  • Lenovo 24 inch touch screen not full screen with 1920x1080 selected (L2461xwA)

    I have the L2461xwA Lenovo Widescreen Touchscreen display. When I set the resolution to 1680x1050, it is full screen as expected. When I set the resolution to 1920x1080, which is the native resolution, it is NOT full screen. There is about a 1 inch black border on the left and right sides and about a 3/4 inch border on the top and bottom when in 1920x1080 resolution. How do I make it full screen?
    (I'm running Windows 7 64-bit and have installed all of the drivers. The touch functionality and camera are working great)
    Thanks

    Nevermind. I resolved it. When I switched DVI ports on my video card, it worked.

  • Full load Concept

    Hi all,
    Can somebody explain to me why there are full loads that should not be deleted from the InfoCube?
    Theoretically, a full load brings all data from the source system, but we have full loads that only bring the data from a specific day, and therefore we need to keep them in the cube/ODS.
    Example:
    In our ODS we keep the full requests. If they are deleted, the data can no longer be reported.
    Now we only have data starting from 23.05, and even though we load a full, the data prior to that day is not being extracted.
    The extractor is also not bringing the data from the past.
    Is this an extractor configuration issue, or is there some way to bring over all the data that is in the R/3 system?
    Thanks,
    Marta.

    Hi Marta,
    Yes, you are right: a full load pulls in all the data from the source system, but only for the restrictions specified in the InfoPackage.
    Now say we do a full load to a cube and also to an ODS, with no selection restrictions.
    As you know, the ODS has overwrite functionality, while the cube does not.
    If we run the full load repeatedly, the same records are pulled every time. In the ODS, if the update rules overwrite the data values, we simply keep the latest data. But in a cube the key figures keep adding up as the same data is loaded multiple times (see the small example below).
    Hence, in the cube case we delete the earlier requests. If instead we do full loads with selection restrictions (say, quarter-wise or month-wise), we do not delete the earlier requests, provided the selections are mutually exclusive.
    And for the extractor issue, could you please elaborate on that?
    Hope it helps.
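    As a small illustration of the additive behaviour described above (the numbers are made up): suppose the same full load delivers the record PROJ1 with amount 100 twice in a row.
    After load 1:  ODS (overwrite) = 100    InfoCube (additive) = 100
    After load 2:  ODS (overwrite) = 100    InfoCube (additive) = 200
    That is why, for a cube, the earlier full request is deleted, or the full loads are restricted to mutually exclusive selections.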

  • How to load the existing Data Mart request into the cube

    Hi Friends,
    I have a scenario: I have a DSO and one cube, and I am loading the data from the DSO to the cube.
    The number of records in the DSO is large, and due to some calculation in the routine I need to load the data year by year (I have data for 2007 and 2008). I have loaded the 2007 data into the InfoCube, and in the DSO I can see the data mart symbol against the request ID when the load finishes successfully.
    When I try to load the 2008 data from the DSO to the cube, I get 0 records. I realised that the data mart symbol already exists in the DSO.
    How do I load the 2008 data in this scenario? If I delete the data mart symbol, I am deleting the cube request.
    Does anyone have an idea on this?
    Thanks in advance.

    Hi,
    Things are not clear: how is the loading happening? Is it a delta or a full load, through a DTP, or are you using the 3.5 flow?
    In any case, if you do a full load or a full repair based on the year selection, it should pick up the records from the source; the data mart status plays no role for full loads.
    The data mart status only comes into the picture when you schedule delta loads.
    So do a full load from the DSO to the cube with a selection on the year; there is no need to delete the data mart status, otherwise the delta will bring that request into the cube again once it is scheduled. Does that request in the DSO contain data for 2008 only? If yes, you can just delete the data mart status for it and run a delta; if not, do full loads as described.
    Thanks
    Ajeet

  • Delta Load is not pulling any data....Need to Run Full Repair EVERYTIME ? ?

    Dear Experts,
    Today, again, I need your help, gurus.
    The delta load runs every day, but it is not bringing any data to BW from CRM. When a user raises an issue about the data, I pull the data with a repair full load using a selection (the same selection as the delta, plus the transaction number).
    The selection for the repair full load is only the transaction number and the sales organization.
    The selection for the normal delta is the sales organization (0CRM_SALORG) 50000609.
    But I am confused about this issue.
    Kindly shed some light on it.
    Thanks,
    Sanjana

    Hi Sanjana:
       Have you applied any SAP Note to fix this problem? Please check if the SAP Note below helps you in solving this issue.
    Note 788172 - "CRM BW adapter: No data in delta with parallel processing"
    Regards,
    Francisco Milán.

  • Does the Full load really remove previous requests and extract new data?

    We wonder whether a full load removes the previous requests/data and brings over new data.
    For example, we extract data from an R/3 table using a full load. On day 1 we load data into the R/3 table and extract it to BW, so the BW data contains only day-1 data. On day 2 we delete the R/3 table data and load data for day 2 only, then extract to BW, still using a full load. What should the BW data contain then: both day-1 and day-2 data, or day-2 data only?
    Any answer?
    Thanks

    Hi Kevin,
      What data the cube should contain is decided by the business requirement.
      The current data load will delete the existing data only if that option has been selected in the InfoPackage; otherwise the new data is appended.
    Thanks,
    Raj

  • Full load failed with  [Microsoft][ODBC SQL Server Driver]Datetime field

    Hi,
    we are doing a full load with RDBMS SQLServer.
    It failed due to the below error.
    [Microsoft][ODBC SQL Server Driver]Datetime field overflow. Can you please help
    thank you

    968651 wrote:
    Hi,
    we are doing a full load with RDBMS SQLServer.
    It failed due to the below error.
    [Microsoft][ODBC SQL Server Driver]Datetime field overflow. Can you please help
    thank you
    http://bit.ly/XUL950
    Thanks,
    Hussein

  • How to do a full load in DAC for a particular Module?

    Hi,
    We have one Execution plan loading all the BI Analytics module as of now.
    1)Financials,
    2)Procurement and Spend,
    3)Supply Chain and Order Management and
    4)EAM
    The issue is that if I go to Tools --> ETL Management --> Reset Data Sources in DAC, it resets the refresh dates for all the tables in the warehouse (and hence a full load for all the modules happens when I start the execution plan).
    I don't want a full load for the other modules; I just want to do a full load for a particular module, let's say the EAM module (and I also want to make sure that this particular full load doesn't create issues with the data of the other modules).
    Let me know how to achieve this.
    Any help in this regard would be highly appreciated.
    Thanks
    Ashish

    I can get a list of all the facts and dimensions for a particular module from the BI Apps Content guide, and then I can go and set the refresh dates to null for those tables.
    Now, should I run the execution plan for that one particular module, or an execution plan containing all the modules?
    Because I got this from someone:
    "The data models for the different modules have many common dimensions, so a reload of one module could leave the facts in another in an invalid state, as the dimension keys would be moved.
    Therefore you should not expect to be able to reload a module in isolation.
    However, you can selectively reload any table by resetting its refresh date in the DAC console.
    The DAC will then determine the dependent objects that also need a full load in order to maintain data consistency."
    Thanks
    Ashish
