Suggestions for eliminating downtime while loading cubes
Hi all,
We are currently looking at multiple daily refreshes of one of our Essbase cubes. One of the challenges I'm facing is that the cube is down while it's being loaded: any users who try to query it during this downtime receive the message "Cannot proceed while the cube is being loaded".
I'm sure there must be a way to set up an environment where the cube remains available during loading. For example, is it possible to copy the cube database to another location after a build, then "deploy" the cube to users from there? That way, reloading would only affect the "deployed" cube once the build is finished and is being copied to the deployment area.
I would welcome advice from anyone on how to avoid this kind of downtime. Without some strategy whereby the previous build is only discarded once the current build is ready, multiple daily refreshes will be unfeasible (the build already takes about an hour and a half, which is a lot of downtime on its own).
I'd greatly appreciate any advice.
Thanks and kind regards,
Mickey G
Hi Mickey,
A better solution would be to have two databases in the application, for example Basic1 and Basic2.
During business hours, users enter data in the Basic1 database. Before the cube build, archive all the data to a text file. Then, as part of the build step, swap the cubes (like a=b and b=a): disconnect the sessions on Basic1 and shift the connections to the Basic2 database. Basic2 is now available to the users, and they can carry on transacting against it (both databases must be kept in sync at the outline level).
Basic1 is then rebuilt with fresh data.
Basic2 is carrying one-day-old data.
Once the cube build is done, swap the cubes back.
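The swap itself is just a pointer flip. Here is a minimal, hypothetical sketch of the pattern in plain Python (these are illustrative names, not Essbase or MaxL commands):

```python
# Hypothetical sketch of the Basic1/Basic2 (blue/green) swap pattern.
# The CubePair class and names are illustrative, not Essbase API calls.

class CubePair:
    """Tracks which of two identical cubes is live and which is being rebuilt."""

    def __init__(self):
        self.active = "Basic1"   # users query this one
        self.staging = "Basic2"  # the loader rebuilds this one

    def rebuild_and_swap(self, load):
        load(self.staging)  # long-running build; users are untouched
        # instant cutover: the freshly built cube becomes live
        self.active, self.staging = self.staging, self.active

pair = CubePair()
pair.rebuild_and_swap(lambda name: None)  # pretend load
print(pair.active)  # Basic2 is now live; Basic1 becomes the next staging copy
```

In practice the "swap" would be whatever redirects user connections (an alias, a substitution variable, or renamed applications); the point is that the long-running load never touches the cube users are querying.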
Please let me know if this is clear; I will try to explain further if needed.
Thanks & Regards,
Upendra.
Similar Messages
-
Need suggestions for concurrent reads while deleting all entities
Hello,
we need some suggestions for the following use case, as we cannot seem to find the correct transaction locking combination to meet our needs.
We are using BDB JE 4.1.10, and the EntityStore is transactional. In a nutshell, BDB JE is used as a cache for specific data that is pulled and updated regularly from a source Oracle DB. The application that holds BDB is a real-time app, and response time is critical.
In order to avoid having to merge (Insert/update/delete) entities for some of the tables that are very static and small (a few hundred records that might change every day or so), we are trying to simply delete all records (with an EntityCursor loop, since there is no 'deleteAll' that we could find) and reinsert all data every 5 minutes, and then committing when the whole process is complete. This should always be very quick, but we are not immune to timeouts from the source Oracle DB, so the transaction can be long.
Ideally, while the delete/insert is happening, we want any concurrent read operations to return the old data (from before the delete). We have made a test that blocks the updater thread between the 'delete all' and 'update all' methods, so that the delete cursor is closed but the Tx is not yet committed. We have tried the following in our reader thread:
1- If we get() a deleted entity with a LockMode.READ_UNCOMMITTED, we get a null entity back.
2- If we get() a deleted entity with LockMode.READ_COMMITTED, we get a LockTimeoutException.
We have also considered simply using truncate on the EntityStore, but this requires closing and reopening the DB, so this would not work for our need to always return a value, instantly.
Any ideas, suggestions on how we could do this in a simple manner?
Thanks,
Max
Hello Max,
I understand the issue you're describing.
I think the best performing solution by far is to load the new data into new databases while servicing reads from the old databases. When the load is complete, reads can be diverted to the new databases, and the old databases can then be removed.
Unfortunately, this is easier to do with the base API than with the DPL, because the DPL is managing the underlying (base API level) databases for you.
However, even using the DPL, this approach could be very straightforward if you happen to be keeping this data set (the one that needs to be reloaded) in a separate EntityStore, or you can change your application to do so. If so, you can create a new (empty) EntityStore with a different name and use this store for loading the new data. When the load is complete, divert reads to the new EntityStore object and remove all databases for the old EntityStore.
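The divert-reads idea can be sketched in a few lines. This is an illustrative stand-in in plain Python, not the BDB JE/DPL API:

```python
# Illustrative sketch of the load-aside-then-divert pattern (not the BDB JE
# API): build the fresh data in a brand-new store, then atomically redirect
# readers to it. Readers see either the old snapshot or the new one, never
# a half-loaded state.

import threading

class SwappableStore:
    def __init__(self, data):
        self._store = dict(data)
        self._lock = threading.Lock()

    def get(self, key):
        return self._store.get(key)  # readers always see a complete store

    def reload(self, fresh_rows):
        new_store = dict(fresh_rows)  # build the replacement off to the side
        with self._lock:
            self._store = new_store   # instant cutover; old store is discarded

cache = SwappableStore({"a": 1})
cache.reload({"a": 2, "b": 3})
print(cache.get("a"))  # 2
```

However long the reload takes (including Oracle timeouts), readers are never blocked and never observe deleted-but-not-yet-reinserted records.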
If this is impractical, please explain, and I'll try to suggest a different solution. If you cannot use a separate EntityStore for this particular data set, then the solution may be more complex.
--mark -
Suggestions for an optimal solution to load an application with a dynamic view
Hi Experts,
I have a web service that provides data from the database as well as data that drives the UI of the view.
In my view I have to create tabs in a tabstrip. The number of tabs is dynamic, depending on data from the web service. In each tab I have to create tables with dynamic columns plus some additional text views and input fields. The tabs and table columns coming from the web service arrive in random order, and I need to sort them according to values held in a database table, which acts as a configuration table.
Creating the view in random order is done, and for creating a sorted view I am planning to use a bucket sort. My question is this: since everything is dynamic, the application takes a long time to load, which is not desirable.
I need your expert suggestions on how I can reduce the time the view takes to load.
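As an aside, the configuration-driven ordering described above need not be a hand-rolled bucket sort. A sketch of the idea, with hypothetical names and in plain Python rather than Web Dynpro Java:

```python
# Hypothetical sketch: order tabs (or columns) received from the web service
# by the sequence defined in a configuration database table. Names are
# illustrative, not from the poster's actual system.

config_order = {"Overview": 0, "Details": 1, "History": 2}  # from the config table
service_tabs = ["History", "Overview", "Details"]           # random order from the service

# Tabs missing from the configuration fall to the end instead of failing.
sorted_tabs = sorted(service_tabs, key=lambda t: config_order.get(t, len(config_order)))
print(sorted_tabs)  # ['Overview', 'Details', 'History']
```

The sort itself is O(n log n) over a handful of tabs and is very unlikely to be the bottleneck; dynamic UI element creation usually dominates load time, which is where the componentization advice below applies.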
Regards
Pranav
Hi Friend,
There are many ways to optimize the performance of the application.
First we need to concentrate on basic Web Dynpro concepts like componentization. By applying this concept we can achieve the goal. We need to decide whether a monolithic component architecture or a multi-component architecture is suitable, and how communication should flow between the components. You can find more on this here:
[Use components in bigger Web Dynpro projects (componentization) |http://wiki.sdn.sap.com/wiki/display/WDJava/UsecomponentsinbiggerWebDynproprojects+(componentization)]
Second, check that the Web Dynpro hook methods are implemented properly; if they are, most of the application flow will run smoothly. There is a very good presentation on this available on SDN here:
http://www.sdn.sap.com/irj/scn/elearn;jsessionid=(J2EE3417200)ID0393138650DB00661116440296043375End?rid=/library/uuid/20e72f90-efea-2b10-0382-8dd420ee555e&overridelayout=true
You should also check these links; they will be very helpful.
[Best Practices for Optimizing Web Dynpro Java Application Performance|http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/604ddc2f-ec9c-2b10-1682-be37e1c62dee?quicklink=index&overridelayout=true]
[Best-Practices for Building State-of-the-Art Web Dynpro Java User Interfaces - Exercises|http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/50568365-ba9e-2b10-1188-a612a20edf31?quicklink=index&overridelayout=true]
Regards
Jeetendra -
Error in code page mapping for source system while loading data from ECC
HI Gurus,
I am working on an implementation project. Our BI sandbox recently came up, and when I run my load from 0COMP_CODE_ATTR it throws the error "Error in code page mapping for source system" (this is my first load from ECC).
The Details tab shows that the data was sent from the source system, but it is not reaching the PSA.
Please let me know if there are any settings needs to be made.
Many thanks in Advance
Jagadeesh
Hi V,
Thanks for your quick response. I tried that, but it didn't resolve the issue. The system ID I have is 3 characters (LEC), but the field only takes 2, so I clicked the "Propose system IDs" button and it proposed LE; the issue is still there.
Do we need to do any settings in LBWE??
Thanks and Regards
Jagadeesh -
Invalid Number Error for Decimal Field While Loading Data
I am loading a delimited text file using SQL*Loader, but I am receiving an error on my decimal fields. When a decimal field has only leading zeros before the decimal point, I get an invalid-number error. The examples below will clarify:
i.e.) 00000000.30 [*Invalid number*]
i.e.) 00046567.45 [*Valid number*]
i.e.) 00000001.00 [*Valid number*]
I've tried setting the precision/scale on the table and tried declaring the field as DECIMAL instead of NUMBER; none of these fixed the issue. Here is the relevant part of the control file:
POLICY_NUMBER NUMBER,
EFFECTIVE_DATE DATE "YYYYMMDD" NULLIF EFFECTIVE_DATE = '',
TRANSACTION_DATE DATE "YYYYMMDD",
TRANSACTION_AMOUNT DECIMAL EXTERNAL, -- Tried TRANSACTION_AMOUNT DECIMAL EXTERNAL (10) & TRANSACTION_AMOUNT NUMBER
MF_TRX_CODE NUMBER,
USER_ID CHAR,
GROUP_NUMBER NUMBER,
EXPIRATION_DATE DATE "YYYYMMDD" NULLIF EXPIRATION_DATE = '',
BILL_NUMBER NUMBER,
Any help is greatly appreciated. Thanks beforehand.
Hi,
Connected to Oracle Database 10g Express Edition Release 10.2.0.1.0
Connected as hr
SQL> SELECT * FROM TEST;
TRANSACTION_AMOUNT
SQL> SELECT * FROM TEST;
TRANSACTION_AMOUNT
11000,00
293,37
2000,00
1134,32
0,30
SQL>
Between the selects I loaded the table with SQL*Loader using...
Load Data
INFILE *
APPEND
INTO TABLE TEST
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
TRANSACTION_AMOUNT DECIMAL EXTERNAL
BEGINDATA
00011000.00
00000293.37
00002000.00
00001134.32
00000000.30
The log is:
SQL*Loader: Release 10.2.0.1.0 - Production on Tue Dec 23 17:23:47 2008
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: test.ctl
Data File: test.ctl
Bad File: test.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table TEST, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
TRANSACTION_AMOUNT FIRST * | CHARACTER
Table TEST:
5 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 16512 bytes(64 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 5
Total logical records rejected: 0
Total logical records discarded: 0
Run began on Tue Dec 23 17:23:47 2008
Run ended on Tue Dec 23 17:23:50 2008
Elapsed time was: 00:00:02.86
CPU time was: 00:00:00.06
Regards, -
Dear All,
In our production system, we have transported our first infocube and dataflow 0FIAR_C03. (3.x dataflow)
First the DSO 0FIAR_O03 was loaded and the data was activated.
Then when we try to load the data ,from the DSO to the infocube 0FIAR_C03,
when we go to RSMO to check the process load, it is showing yellow colour for a very long time, and finally failing with the error -
"Short dump in the Warehouse
Diagnosis
The data update was not finished. A short dump has probably been logged in BI. This provides information about the error.
System Response
"Caller 70" is missing. "
But in RSMO, immediately after we start the load, when we check the Details tab we see that for the data package, the Update tab shows "Processing end: Missing messages".
Also at this time itself it is throwing a dump in ST22, CX_RSR_X_MESSAGE -
Dump Details are -
Program............. "SAPLRRMS"
Screen.............. "SAPMSSY1 3004"
Screen Line......... 2
68 CALL FUNCTION 'RRMS_MESSAGE_HANDLING'
69 EXPORTING
70 i_class = 'BRAIN'
71 i_type = l_type
72 i_number = '299'
73 i_msgv1 = i_program
74 i_msgv2 = i_text.
75 ELSE.
76 DATA: l_text TYPE string,
77 l_repid TYPE syrepid.
78
79 l_repid = i_program.
80 l_text = i_text.
81
>>>>> RAISE EXCEPTION TYPE cx_rsr_x_message
Can you please let me know,what the error might be?
With Warm Regards,
Vineeth
Hi Vineeth,
The characteristic 0REQUEST is used in DataStore objects and the characteristic 0REQUID is used in InfoCubes. Both characteristics must share the same master data ID (SID) number range; if they do not, problems such as this dump can occur.
To fix this, transfer the InfoObject 0REQUEST from the Content again:
Transaction RSOR -> BI Content -> Object Types -> under InfoObjects select 0REQUEST and transfer it to the right (note the following settings: "Only Necessary Objects"; collection mode: "Collect Automatically").
For more details check SAP Note:1157796
Hope it helps
SujanR -
Change form mode to Add for a master data form while loading
Hi,All
I have a master data form (user form). By default it loads in Find mode, but I don't want Find mode; I want Add mode after loading.
Please suggest a good approach.
By
Firos.C
Hi,
After the form is loaded in the load event, change the form mode.
objForm.Mode = SAPbouiCOM.BoFormMode.fm_ADD_MODE
Or
objSBOAPI.SBO_Appln.ActivateMenuItem("1282")
Hope it helps,
Vasu Natari. -
Firefox for Android crashing while loading website
Hello all.
I've purchased an Android 4.0 smartphone recently - LG L5.
I then installed the Firefox version for Android and tried to access a webpage I've been building. Though when it's still loading, Firefox crashes.
After browsing some related articles, I installed Firefox Nightly and the issue persists.
What is exactly causing it to crash? Other webpages work just fine. Thanks in advance.
We're sorry to hear that Firefox is crashing. In order to assist you better, please follow the steps below to provide us crash IDs to help us learn more about your crash.
1. Enter about:crashes in the address bar (that's where you enter your website URL) and press Enter. You should now see a list of submitted crash reports.
2. Copy the 5 most recent crash IDs that you see in the crash report window and paste them into your response here.
Thank you for your cooperation! -
Suggestions for eliminating hissing noise from Synced Synth Recordings?
When recording an external synth, the volume levels are very low. I connect the midi cables to my interface (Alesis io14). I also connect the audio ins and outs. This has helped me to produce other music from composers who want to dump from their synths. We do a track at a time.
Unfortunately, when recording this way the volume is lacking, and if we want to use the original performance from that synth I always have to over-compress the audio tracks, which creates hissing. I can't think of anything else to do, as I always make sure the keyboard volume is maxed and all keyboard tracks are turned up as well. Any suggestions on getting rid of the hiss?
Get some good preamps, or a FireWire mixer with good preamps, to amplify the signal before it hits the A/D converter.
To eliminate existing hiss, check out iZotope's RX plug-in: the denoiser section has a learn function that takes a fingerprint of the noise to eliminate and then very accurately removes that frequency spectrum from the audio file. Check out the demo:
http://www.izotope.com/products/audio/rx/ -
Issue while loading data from DSO to Cube
Gurus,
I'm facing a typical problem while loading a cube from a DSO. The load is based upon certain conditions.
Though the data in the DSO satisfies those conditions, the cube is still not being loaded.
All the loads are managed through Process Chains.
Would there be any reason in specific/particular for this type of problem?
Any pointers would be of greatest help !
Regards,
Yaseen & Soujanya
Yaseen & Soujanya,
It is very hard to guess the problem with the amount of information you have provided.
- What do you mean by cube is not being loaded? No records are extracted from DSO? Is the load completing with 0 records?
- How is data loaded from DSO to InfoCube? Full? Are you first loading to PSA or not?
- Is there data already in the InfoCube?
- Is there change log data for DSO or did someone delete all the PSA data?
Since there are so many possible reasons for the behavior you are witnessing, your best option is to approach your resident expert.
Good luck.
Sudhi Karkada
<a href="http://main.nationalmssociety.org/site/TR/Bike/TXHBikeEvents?px=5888378&pg=personal&fr_id=10222">Biking for MS Relief</a> -
SQL*LOADER ERROR WHILE LOADING ARABIC DATA INTO UNICODE DATABASE
Hi,
I was trying to load Arabic data using SQL*Loader from a datafile in .CSV format, but I am hitting a "value too large for column" error while loading, and some rows are rejected because of it. My target database character set is:
Characterset : AL32UTF8
National Character set: AL16UTF16
DB version:-10g release 2
OS:-Cent OS 5.0/redhat linux 5.0
I have specified the character sets AR8MSWIN1256, AR8ISO8859P6, AL32UTF8 and UTF8 (one at a time) in the SQL*Loader control file, but I get the same error in every case.
I have also created the table with CHAR semantics and specified "LENGTH SEMANTICS CHAR" in the control file, but again the same error occurs.
I have also changed the NLS_LANG setting.
What puzzles me is that the data I am trying to load already resides in this same database. But when I generate a CSV of that data and try to load it back into the same database and the same table structure, I get "value too large for column".
So what is the problem, basically? Is the datafile at fault (I generate the CSV programmatically), or is there a problem in my approach to loading Unicode data?
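For reference, multibyte expansion is the usual mechanism behind "value too large" in an AL32UTF8 database when a column uses byte length semantics. A quick stand-alone illustration (Python here, purely to show the byte math, not SQL*Loader, and not tied to the poster's data):

```python
# Arabic letters take 2 bytes each in UTF-8, so a 5-character string occupies
# 10 bytes. A column declared with BYTE length semantics counts those 10 bytes,
# even though the character count fits.

text = "\u0645\u0631\u062d\u0628\u0627"  # "marhaba" (hello) in Arabic script
print(len(text))                          # 5 characters
print(len(text.encode("utf-8")))          # 10 bytes in an AL32UTF8 database
```

With BYTE semantics, a VARCHAR2(5) would reject this 5-character string; with CHAR semantics it fits, provided the table itself (not just the control file) was really created with CHAR semantics.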
Please help...
Here's what we know from what you've posted:
1. You may be running on an unsupported operating system ... likely not the issue but who knows.
2. You are using some patch level of 10gR2 of the Oracle database but we don't know which one.
3. You've had some kind of error but we have no idea which error or the error message displayed with it.
4. You are loading data into a table but we do not have any DDL so we do not know the data types.
Perhaps you could provide a bit more information.
Perhaps a lot more. <g> -
I keep getting this message: AN ERROR OCCURRED WHILE LOADING THIS CONTENT
ATV was working fine until i did a version update last week. Since then i have rented two movies and keep getting the same message
I restarted the ATV - no success
I reset my wireless - no success
I reset my ATV - no success
Netflix is working fine, since I was able to watch a movie there. YouTube was also working fine. I think something must be wrong with iTunes.
Any ideas?
Error while loading from PSA to Cube. RSM2-704 & RSAR -119
Dear Gurus,
I have to extract about 1.13 million records from the setup tables into the MM cube 0IC_C03. While doing so, the following two errors occurred while loading from the PSA to the cube:
Error ID - RSM2 & error no. 704: Data records in the PSA were marked for data package 39 . Here there were 2 errors. The system wrote information or warnings for the remaining data records.
Error ID- RSAR & No. 119: The update delivered the error code 4 .
(Data records for package 39 selected in PSA - 2 error(s). Record 5129 :No SID found for value 'EMN ' of characteristic 0BASE_UOM . Record 5132 :No SID found for value 'EMN ' of characteristic 0BASE_UOM )
I tried to change the defective records in the PSA by deleting the erroneous field value EMN and reloading, but it failed.
Now, my questions are:
(i) How can I resolve the issue?
(ii) How do we rectify the erroneous records? Should we only delete the erroneous field value, or delete the entire record from the PSA?
(iii) How do we delete a record from the PSA?
Thanks & regards,
Sheeja
Hi,
Data records for package 39 selected in PSA - 2 error(s). Record 5129 :No SID found for value 'EMN ' of characteristic 0BASE_UOM . Record 5132 :No SID found for value 'EMN ' of characteristic 0BASE_UOM
The issue with record no. 5129 and 5132.
In the PSA, check the erroneous records; it will display only the error records. Just edit them as required and try reloading into the cube.
Deleting single record is not possible.
Let us know if you still have any issues.
Reg
Pra -
Error while loading data into cube 0calday to 0fiscper (2lis_13_vdcon)
Hi all,
I am getting the following error while loading data into the cube:
"Time conversion from 0CALDAY to 0FISCPER (fiscal year V3 ) failed with value 10081031"
Amit Shetye
Hi Amit,
This is a conversion problem: the calendar is not maintained for fiscal year variant "V3" for year 1008.
Maintain the calendar for year 1008 and transfer the global settings from the source (R/3):
RSA1 --> Source systems --> from context menu --> Transfer global settings --> choose fiscal year variants and calendar --> execute
Hope it Helps
Srini -
I've installed SQL Server 2012 SP1 + SP Server 2012 + SSRS and the PowerPivot add-in.
I also configured Excel Services correctly. Everything works fine, but Power View doesn't work!
When I open an Excel workbook containing a Power View report, an error occurs: "An error occurred while loading the model for the item or data source 'EntityDataSource'. Verify that the connection information is correct and that you have permissions
to access the data source."
error detail:
<detail><ErrorCode xmlns="http://www.microsoft.com/sql/reportingservices">rsCannotRetrieveModel</ErrorCode><HttpStatus xmlns="http://www.microsoft.com/sql/reportingservices">400</HttpStatus><Message xmlns="http://www.microsoft.com/sql/reportingservices">An
error occurred while loading the model for the item or data source 'EntityDataSource'. Verify that the connection information is correct and that you have permissions to access the data source.</Message><HelpLink xmlns="http://www.microsoft.com/sql/reportingservices">http://go.microsoft.com/fwlink/?LinkId=20476&EvtSrc=Microsoft.ReportingServices.Diagnostics.Utilities.ErrorStrings&EvtID=rsCannotRetrieveModel&ProdName=Microsoft%20SQL%20Server%20Reporting%20Services&ProdVer=11.0.3128.0</HelpLink><ProductName
xmlns="http://www.microsoft.com/sql/reportingservices">Microsoft SQL Server Reporting Services</ProductName><ProductVersion xmlns="http://www.microsoft.com/sql/reportingservices">11.0.3128.0</ProductVersion><ProductLocaleId
xmlns="http://www.microsoft.com/sql/reportingservices">127</ProductLocaleId><OperatingSystem xmlns="http://www.microsoft.com/sql/reportingservices">OsIndependent</OperatingSystem><CountryLocaleId xmlns="http://www.microsoft.com/sql/reportingservices">1033</CountryLocaleId><MoreInformation
xmlns="http://www.microsoft.com/sql/reportingservices"><Source>ReportingServicesLibrary</Source><Message msrs:ErrorCode="rsCannotRetrieveModel" msrs:HelpLink="http://go.microsoft.com/fwlink/?LinkId=20476&EvtSrc=Microsoft.ReportingServices.Diagnostics.Utilities.ErrorStrings&EvtID=rsCannotRetrieveModel&ProdName=Microsoft%20SQL%20Server%20Reporting%20Services&ProdVer=11.0.3128.0"
xmlns:msrs="http://www.microsoft.com/sql/reportingservices">An error occurred while loading the model for the item or data source 'EntityDataSource'. Verify that the connection information is correct and that you have permissions to access the
data source.</Message><MoreInformation><Source>Microsoft.ReportingServices.ProcessingCore</Source><Message msrs:ErrorCode="rsErrorOpeningConnection" msrs:HelpLink="http://go.microsoft.com/fwlink/?LinkId=20476&EvtSrc=Microsoft.ReportingServices.Diagnostics.Utilities.ErrorStrings&EvtID=rsErrorOpeningConnection&ProdName=Microsoft%20SQL%20Server%20Reporting%20Services&ProdVer=11.0.3128.0"
xmlns:msrs="http://www.microsoft.com/sql/reportingservices">Cannot create a connection to data source 'EntityDataSource'.</Message><MoreInformation><Source></Source><Message>For more information about this error navigate
to the report server on the local server machine, or enable remote errors</Message></MoreInformation></MoreInformation></MoreInformation><Warnings xmlns="http://www.microsoft.com/sql/reportingservices" /></detail>
Please help me solve this issue. I don't know whether uploading the Excel workbook is enough, or whether it needs to connect to another data source.
I appreciate it in advance.
Hi Ali.y,
Based on the current error message, the error can be related to the Claims to Windows Token Service (C2WTS) and is expected under certain conditions. To verify the issue, please check the following:
1. The C2WTS Windows service and C2WTS SharePoint service are both running.
2. Check the SQL Server Browser service is running on the machine that has the PowerPivot instance of SSAS.
3. Check the domains. You're signing into SharePoint with a user account in some domain (call it Domain A). Either Domain A must be the same as Domain B, the domain in which the SharePoint server itself is located, or Domain A must trust Domain B.
In addition, the error may be caused by a Kerberos authentication issue due to a missing SPN. To make Kerberos authentication work, you need to configure Analysis Services to run under a domain account and register the SPNs for the Analysis Services server.
To create the SPN for the Analysis Services server that is running under a domain account, run the following commands at a command prompt:
• Setspn.exe -S MSOLAPSvc.3/Fully_Qualified_domainName OLAP_Service_Startup_Account
Note: Fully_Qualified_domainName is a placeholder for the FQDN.
• Setspn.exe -S MSOLAPSvc.3/serverHostName OLAP_Service_Startup_Account
For more information, please see:
How to configure SQL Reporting Services 2012 in SharePoint Server 2010 / 2013 for Kerberos authentication
Regards,
Heidi Duan
TechNet Community Support