Connect By with too many rows
Hello,
I have a table of contracts. Each can have a different payment interval: M monthly, Q quarterly, Y yearly or E single payment.
Now I want a SELECT that shows all the dates from the begin date to maturity (duration) on which collection of the outstanding amounts takes place.
DROP TABLE contract;
CREATE TABLE contract (
nr NUMBER (5)
,dat_begin DATE
,invoiced VARCHAR(1)
,duration NUMBER(5)
);
INSERT INTO contract (nr,dat_begin,invoiced,duration)
VALUES (345,TO_DATE('01.01.2008','dd.mm.yyyy'),'M',1);
--INSERT INTO contract (nr,dat_begin,invoiced,duration)
-- VALUES (456,TO_DATE('01.01.2008','dd.mm.yyyy'),'Q',2);
INSERT INTO contract (nr,dat_begin,invoiced,duration)
VALUES (567,TO_DATE('01.01.2007','dd.mm.yyyy'),'Y',1);
--INSERT INTO contract (nr,dat_begin,invoiced,duration)
-- VALUES (678,TO_DATE('01.01.2008','dd.mm.yyyy'),'E',2);
WITH
first_month AS(
SELECT TRUNC(dat_begin,'MM') AS first_date
,nr
,DECODE (invoiced
,'M',1
,'Q',3
,'Y',12
,'E',1000
) AS intervall
,duration
FROM contract
)
SELECT ADD_MONTHS(first_date,(LEVEL - 1) * intervall) AS all_dates
,nr
,duration
,intervall
,CONNECT_BY_ROOT nr
FROM first_month
CONNECT BY LEVEL <= duration * 12 / intervall
Now I expect to get 12 dates for contract 345 and one for 567. But...
ALL_DATE NR DURATION INTERVALL LEVEL CONNECT_BY_ROOTNR
01.01.08 345 1 1 1 345
01.02.08 345 1 1 2 345
01.03.08 345 1 1 3 345
01.04.08 345 1 1 4 345
01.05.08 345 1 1 5 345
01.06.08 345 1 1 6 345
01.07.08 345 1 1 7 345
01.08.08 345 1 1 8 345
01.09.08 345 1 1 9 345
01.10.08 345 1 1 10 345
01.11.08 345 1 1 11 345
01.12.08 345 1 1 12 345
01.01.07 567 1 12 1 567
01.02.08 345 1 1 2 567
01.03.08 345 1 1 3 567
01.04.08 345 1 1 4 567
01.05.08 345 1 1 5 567
01.06.08 345 1 1 6 567
01.07.08 345 1 1 7 567
01.08.08 345 1 1 8 567
01.09.08 345 1 1 9 567
01.10.08 345 1 1 10 567
01.11.08 345 1 1 11 567
01.12.08 345 1 1 12 567
I didn't expect any of the rows after row 13 ('01.01.07 567').
Of course I can add a predicate CONNECT_BY_ROOT nr = nr. But I would like to understand why I get 11 rows for nr 345 with root 567.
Regards
Marcus
Hi,
In a counter sub-query, where you use "CONNECT BY LEVEL <= x" to generate the integers 1, 2, ..., x, the table
in the FROM-clause must have only one row.
Do this:
WITH
first_month AS(
SELECT TRUNC(dat_begin,'MM') AS first_date
,nr
,DECODE (invoiced
,'M',1
,'Q',3
,'Y',12
,'E',1000
) AS intervall
,duration
,cntr.n
FROM contract
)
,cntr AS
( -- Begin counter sub-query
SELECT LEVEL AS n
FROM dual
CONNECT BY LEVEL <=
( -- Begin scalar sub-query to get max range
SELECT MAX (duration * 12 / intervall)
FROM first_month
) -- End scalar sub-query to get max range
) -- End counter sub-query
SELECT ADD_MONTHS(first_date,(cntr.n - 1) * intervall) AS all_dates
,nr
,duration
,intervall
FROM first_month
JOIN cntr ON cntr.n <= duration * 12 / intervall
;
As you can see, this solution is very similar to Blushadow's.
"CONNECT BY" implies a parent-child relationship.
When the CONNECT BY condition does not refer to any values stored in the table, as in "CONNECT BY LEVEL <= x",
then every row in the table will be considered the parent of every other row. So if you have J rows in the table,
you will have J rows at LEVEL=1, J*J rows at LEVEL=2, J*J*J rows at LEVEL=3, ..., POWER(J, K) rows at LEVEL=K.
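A quick way to see this multiplication is a counter run against a deliberately two-row table. This is a sketch (the table name t and column x are made up for illustration, not taken from the thread):

```sql
CREATE TABLE t AS
SELECT 1 AS x FROM dual
UNION ALL
SELECT 2 AS x FROM dual;

-- Without a PRIOR condition, every row is treated as a child of every row,
-- so the row count per LEVEL grows as POWER(J, K):
SELECT LEVEL, COUNT(*) AS row_count
FROM t
CONNECT BY LEVEL <= 3
GROUP BY LEVEL
ORDER BY LEVEL;
```

With J = 2 rows you should see counts of 2, 4 and 8 at LEVELs 1, 2 and 3, which is exactly why the counter sub-query must run against a one-row table such as dual.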
I don't think the query you posted produced the results you posted.
The query has five columns in the SELECT-clause, but the result set has six.
What is the purpose of the last column, with CONNECT_BY_ROOT? If it's supposed to be the same as nr,
then it's redundant; you already have a column containing nr.
Similar Messages
-
Result set does not fit; it contains too many rows
Dear All,
We are on BI7 and running reports in Excel 2007. Even though the row limit in Excel 2007 is more than 1 million, when I try to execute a report with more than 65k records of output, the system generates output only for 65k rows with the message "Result set does not fit; it contains too many rows".
Our Patch levels:
GUI - 7.10
Patch level is 11
Is there any way to generate more than 65000 rows in Bex?
Thanks in advance...
regards,
Raju
Dear Gurus,
Could you please shed some light on this issue?
thanks and regards,
Raju
Edited by: VaraPrasadraju Potturi on Apr 14, 2009 3:13 AM
Vara Prasad,
This has been discussed on the forums - for reasons of backward compatibility I do not think BEx supports more than 65,000 rows. I am still not sure about this, since I have not tried a query with more than 65K rows on Excel 2007, but I think it is not possible. -
List of Value: Best practice when there are too many rows.
Hi,
I am working in JDev12c. Imagine the following scenario. We have an employee table with organization_id as one of its attributes. I want to set up an LOV for this attribute. From what I understand, if the Organization table contains too many rows (like 3000), this will create extreme overhead; also, it would be impossible to scroll down in a simple LOV. So, I have decided on the obvious option: to use the LOV as a Combo Box with List of Values. Great so far.
That LOV will be used for each user, but it doesn't really depend on the user, and the list of organizations will rarely change. I have a sharedApplicationModule that I am using to retrieve lookup values from the DB. Do you think it would be OK to put my ORGANIZATION VO in there and create the View Accessor for my LOV in the Employees View?
What considerations should I take in terms of TUNING the Organization VO?
Regards
Hi Raghava,
as I said, "Preparation Failed" may be (if I recall correctly) as early as the HTTP request to even get the document for indexing. If this is not possible for TREX, then of course the indexing fails.
What I suggested was a manual reproduction. So log on to the TREX host (preferably with the user that TREX uses to access the documents) and then simply try to open one of the docs with the "failed" status by pasting its address in the browser. If this does not work, you have a pretty good idea what's happening.
Unfortunately, if that were the case, this would be some issue in network communications or ticketing and authorizations, which I cannot tell you from here how to solve.
In any case, I would advise to open a support message to SAP - probably rather under the portal component than under TREX, as I do not assume that this stage of a queue error has anything to do with the actual engine.
Best,
Karsten -
After my friend locked me out of my iPhone with too many wrong password attempts I restored my iPhone via recovery mode and now it is showing that picture of a cable leading towards itunes, and itunes says the sim is not supported but it is the original sim. How do I fix this?
Something went wrong with the update, this can and does happen with every version of iOS.
There is nothing wrong with the update.
Simply restore the device via iTunes on the computer.
If iTunes is stating it will take hours to update, that indicates an extremely slow Internet connection and is likely the reason the OTA update failed. -
Exception too many rows...
Hi
I am getting two different outputs with the following code, depending on whether I declare the variable the first or the second way.
When I declare the variable v_empno as NUMBER(10) and the TOO_MANY_ROWS exception is raised, the variable is null afterwards when I print it with dbms_output.
But when I declare the same variable as table.column%TYPE and the same scenario happens, the variable is not null when I print it; it holds the first value from the output of the query.
declare
--v_empno number(10);
v_empno emp.empno%type;
begin
dbms_output.put_line('before '||v_empno );
select empno into v_empno from emp;
dbms_output.put_line('first '||v_empno);
exception when too_many_rows then
dbms_output.put_line('second '||v_empno);
dbms_output.put_line('exception'||sqlerrm);
end;
Is there any specific reason for this?
Your comments please.
Thanks
Sidhu
In 9i:
SQL> declare
2 --v_empno number(10);
3 v_empno emp.empno%type;
4 begin
5 dbms_output.put_line('before '||v_empno );
6 select empno into v_empno from emp;
7 dbms_output.put_line('first '||v_empno);
8 exception when too_many_rows then
9 dbms_output.put_line('second '||v_empno);
10 dbms_output.put_line('exception'||sqlerrm);
11 end;
12 /
before
second 7369
exceptionORA-01422: exact fetch returns more than requested number of rows
PL/SQL procedure successfully completed.
SQL> declare
2 v_empno number;
3 --v_empno emp.empno%type;
4 begin
5 dbms_output.put_line('before '||v_empno );
6 select empno into v_empno from emp;
7 dbms_output.put_line('first '||v_empno);
8 exception when too_many_rows then
9 dbms_output.put_line('second '||v_empno);
10 dbms_output.put_line('exception'||sqlerrm);
11 end;
12 /
before
second
exceptionORA-01422: exact fetch returns more than requested number of rows
PL/SQL procedure successfully completed.
SQL> edit
Wrote file afiedt.buf
1 declare
2 v_empno number(10);
3 --v_empno emp.empno%type;
4 begin
5 dbms_output.put_line('before '||v_empno );
6 select empno into v_empno from emp;
7 dbms_output.put_line('first '||v_empno);
8 exception when too_many_rows then
9 dbms_output.put_line('second '||v_empno);
10 dbms_output.put_line('exception'||sqlerrm);
11* end;
SQL> /
before
second 7369
exceptionORA-01422: exact fetch returns more than requested number of rows
PL/SQL procedure successfully completed.
In 10g:
SQL> declare
2 v_empno number(10);
3 --v_empno emp.empno%type;
4 begin
5 dbms_output.put_line('before '||v_empno );
6 select empno into v_empno from emp;
7 dbms_output.put_line('first '||v_empno);
8 exception when too_many_rows then
9 dbms_output.put_line('second '||v_empno);
10 dbms_output.put_line('exception'||sqlerrm);
11 end;
12 /
before
second 7369
exceptionORA-01422: exact fetch returns more than requested number of rows
PL/SQL procedure successfully completed.
SQL> edit
Wrote file afiedt.buf
1 declare
2 v_empno number;
3 --v_empno emp.empno%type;
4 begin
5 dbms_output.put_line('before '||v_empno );
6 select empno into v_empno from emp;
7 dbms_output.put_line('first '||v_empno);
8 exception when too_many_rows then
9 dbms_output.put_line('second '||v_empno);
10 dbms_output.put_line('exception'||sqlerrm);
11* end;
SQL> /
before
second 7369
exceptionORA-01422: exact fetch returns more than requested number of rows
PL/SQL procedure successfully completed.
SQL> edit
Wrote file afiedt.buf
1 declare
2 --v_empno number;
3 v_empno emp.empno%type;
4 begin
5 dbms_output.put_line('before '||v_empno );
6 select empno into v_empno from emp;
7 dbms_output.put_line('first '||v_empno);
8 exception when too_many_rows then
9 dbms_output.put_line('second '||v_empno);
10 dbms_output.put_line('exception'||sqlerrm);
11* end;
SQL> /
before
second 7369
exceptionORA-01422: exact fetch returns more than requested number of rows
PL/SQL procedure successfully completed.
Anyhow, you should not rely on the fact that Oracle fetches the first value into the variable
and keeps it when the exception is raised.
Tom Kyte discusses the SELECT INTO issue here:
http://asktom.oracle.com/pls/ask/f?p=4950:8:7849913143702726938::NO::F4950_P8_DISPLAYID,F4950_P8_CRITERIA:1205168148688
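If you need a deterministic value even when the query can match several rows, fetch one row on purpose instead of relying on what a failed SELECT INTO left in the variable. A minimal sketch against the same emp table (picking the lowest empno is just an arbitrary choice for illustration):

```sql
declare
  v_empno emp.empno%type;
begin
  -- Deliberately select exactly one row, so TOO_MANY_ROWS cannot fire:
  select empno
    into v_empno
    from (select empno from emp order by empno)
   where rownum = 1;
  dbms_output.put_line('empno: ' || v_empno);
end;
/
```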
Rgds. -
Can't connect to Server - "too many users" after 10.8 install
After upgrading my Retina MBP, my Air, and my mid-2011 iMac (which acts as a file server) to Mountain Lion, I find I cannot consistently connect to the iMac server. Everything is done over an AirPort Extreme wifi network, and there were no issues in 10.7. It usually happens 24-48 hours after a restart, and a restart will fix it, though that is inconvenient. The error given on the machine trying to connect is "too many users" on the host (the iMac server). Any thoughts?
Me too... similar setup. MBPr, MBP and an old MB with a Mini/drobo as a file server. A reboot fixes it, but that's a pain if I'm not home.
I have noticed that I can usually share the screen even if I can't mount a drive, which helps, but it's still just another hurdle. -
WUT-113 Too many rows when using Webutil to upload a doc or pdf
Hi All,
I am using Webutil to upload and view files and am receiving the error msg 'WUT-113 Too many rows matched the supplied where clause'
The process works fine for uploading photos but not when I try and upload documents and pdf files.
I am on v 10.1.2.0.2 and using XP.
The code example (for documents) is outlined below but think that the issue must be to do with some sort of incorrect Webutil configuration
on my machine.
declare
vfilename varchar2(3000);
vboolean boolean;
begin
vfilename := client_get_file_name('c:\',file_filter => 'Document files (*.doc)|*.doc|');
vboolean := webutil_file_transfer.Client_To_DB_With_Progress
( vfilename,
'lobs_table',
'word_blob',
'blob_id = '||:blob_id,
'Progress',
'Uploading File '||vfilename,
true,
'CHECK_LOB_PROGRESS');
end;
Any assistance in this matter would be appreciated.
kind regards,
Tom
Hi Sarah,
Many thanks for the reply.
All I'm trying to do is to click on the Browse Document button in a form to upload
a document (in this example) from my machine and save it to the db table called lobs_table
using the webutil_file_transfer.Client_To_DB_With_Progress program.
When I first access the form the field :blob_id is populated (by a When-create-Record trigger)
with a value made up of sysdate in NUMBER format as DDMMHHMISS e.g. 0106101025
When I press 'Browse Document' (a button in the form), the dialog box opens and
I select a document and click OK. Then I see the error message 'WUT-113 Too many rows matched the supplied where clause', and yet the where clause element of the call to the webutil_file_transfer.Client_To_DB_With_Progress program
should be the :blob_id value ('blob_id = '||:blob_id), i.e. a single value populated in the field when I first access the form - so why am I seeing the too many rows error?
I may be missing something obvious as I've only just started using Webutil.
Kind regards,
Tom -
I have two data blocks: one data block joins two tables, and the second data block is based on one table.
The first data block has all fields in a 1:1 relationship with packing_id, and the second data block (details) has multiple rows
for every packing_id. I wrote 2 procs for the 2 data blocks, called in the respective Post-Query triggers.
My problem is that when I run the form it gives the message 'too many rows found_orders_begin'.
Here is my code.
PROCEDURE post_query IS
CURSOR mast_cur IS
SELECT pa.ship_to_last_name,
pa.ship_to_first_name,
pa.ship_to_address1,
pa.ship_to_address2,
pa.ship_to_city,
p.packing_id
FROM packing_attributes pa, packing p
WHERE p.packing_id = pa.packing_id
AND p.packing_id = :PACKING_JOINED.PACKING_ID;
BEGIN
Message('too many rows found_orders_begin');
OPEN mast_cur;
loop
FETCH mast_cur INTO :PACKING_JOINED.SHIP_TO_LAST_NAME,
:PACKING_JOINED.SHIP_TO_FIRST_NAME,
:PACKING_JOINED.SHIP_TO_ADDRESS1,
:PACKING_JOINED.SHIP_TO_ADDRESS2,
:PACKING_JOINED.SHIP_TO_CITY,
:PACKING_JOINED.PACKING_ID;
EXIT WHEN mast_cur%NOTFOUND;
end loop;
CLOSE mast_cur;
EXCEPTION
WHEN too_many_rows THEN
Message('too many rows found');
WHEN no_data_found THEN
Message('no data was found there');
WHEN OTHERS THEN
Message('do something else');
END post_query;
Detail proc
PROCEDURE post_query IS
CURSOR det_cur IS
SELECT pd.quantity,
pd.stock_number
FROM packing_details pd, packing p
WHERE p.packing_id = pd.packing_id
AND pd.packing_id = :PACKING_JOINED.PACKING_ID;
BEGIN
Message('too many rows found_pack_begin');
OPEN det_cur;
FETCH det_cur INTO
:DETAILS.QUANTITY,
:DETAILS.STOCK_NUMBER;
CLOSE det_cur;
EXCEPTION
WHEN too_many_rows THEN
Message('too many rows found');
WHEN no_data_found THEN
Message('no data was found there');
WHEN OTHERS THEN
Message('do something else');
END post_query;
Thanks in advance for your help.
Sandy
Thanks for the reply.
Maybe it gives this message because you have programmed to show this message ?
I intentionally added this message to see how far my code was getting. If I don't add this message and execute a query, I get FRM-41050: You cannot update this record.
That is even though I am not updating the record (I am querying it) and the data block's Update Allowed is set to NO.
Some additional comments on your code:
What is the loop supposed to do? You just fill the same Forms fields repeatedly with the values of your cursor, so after the loop only the last record from your query will be shown. In general, in POST-QUERY you read lookups, not details.
Sorry, but I have no idea how to show detail records; that's why I tried the loop. In the first proc only one row will be returned, so I guess I don't need a loop there?
In the second there will be multiple rows for one packing_id (packing_id is the common column for both blocks); please let me know how to do that?
Your exception handlers for NO_DATA_FOUND and TOO_MANY_ROWS are useless, as these exceptions cannot be raised by cursor fetches; they are raised only by SELECT INTO.
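For the master block, since exactly one row is expected, no cursor loop is needed at all: a plain SELECT INTO does the lookup and makes those handlers meaningful. A sketch using the tables from the post (assuming packing_id really is unique in packing_attributes):

```sql
PROCEDURE post_query IS
BEGIN
  -- Single-row lookup: SELECT INTO raises NO_DATA_FOUND / TOO_MANY_ROWS itself.
  SELECT pa.ship_to_last_name,
         pa.ship_to_first_name,
         pa.ship_to_city
    INTO :PACKING_JOINED.SHIP_TO_LAST_NAME,
         :PACKING_JOINED.SHIP_TO_FIRST_NAME,
         :PACKING_JOINED.SHIP_TO_CITY
    FROM packing_attributes pa
   WHERE pa.packing_id = :PACKING_JOINED.PACKING_ID;
EXCEPTION
  WHEN no_data_found THEN
    Message('no data was found there');
  WHEN too_many_rows THEN
    Message('too many rows found');
END post_query;
```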
I will remove these. Thanks
Sandy
Edited by: sandy162 on Apr 2, 2009 1:28 PM -
Can you have too many rows in a table?
How many rows would you consider to be too many for a single table and how would you re-arrange the data if
asked?
any answers?
sukai
I have some tables with over 100 million rows that still perform well, and I'm sure much larger tables are possible. The exact number of rows would vary significantly depending on a number of factors including:
Power of the underlying hardware
Use of the table – frequency of queries and updates
Number of columns and data types of the columns
Number of indexes
Ultimately the answer probably comes down to performance – if queries, updates, inserts, index rebuilds, backups, etc. all perform well, then you do not yet have too many rows.
The best way to rearrange the data would be horizontal partitioning. It distributes the rows into multiple files, which provides a number of advantages, including the potential to perform well with a larger number of rows.
http://msdn.microsoft.com/en-us/library/ms190787.aspx -
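As a rough illustration of horizontal partitioning in SQL Server (the names pf_id, ps_id and big_table are made up; the boundary values depend entirely on your data distribution):

```sql
-- Partition function: splits rows into ranges by id.
CREATE PARTITION FUNCTION pf_id (int)
AS RANGE RIGHT FOR VALUES (10000000, 20000000, 30000000);

-- Partition scheme: maps each range to a filegroup (all to PRIMARY here).
CREATE PARTITION SCHEME ps_id
AS PARTITION pf_id ALL TO ([PRIMARY]);

-- The table is created on the scheme, partitioned by its id column.
CREATE TABLE big_table (
    id   int NOT NULL,
    data varchar(100)
) ON ps_id (id);
```

In practice you would spread the partitions over separate filegroups to get the I/O benefits described in the linked article.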
I have been blocked from my own ITunes songs because my apple ID associated with too many devices
I cannot access songs from my own iTunes Apple ID because it says my iPhone has been associated with too many devices. I have to wait 90 days to access my own songs on my iPhone. Here is what happend:
1-Our family moved. So, with the address change, the credit cards on all of the apple ID's would no longer work because the other users had our old address. (I have 4 users in the family).
2-My family started calling me while I was out of town complaining that purchases were rejected. This was because our address had changed and it no longer matched.
3-I tried to walk them thru the process of changing the home address on the credit card, but they couldn't do it. They assured me that they were doing it correctly and were still rejected.
4-So, being the techie of the family, I simply logged into each of their iTunes accounts from my iPhone while I was out of town and made the address change. While I was in their accounts, I noticed they didn't have automatic downloads and iTunes match and stuff. So, I changed it so it would work for them when they logged back in.
5-I go back to log into my own account on my iphone and I cannot access my own songs on the iTunes cloud for 90 days!!!! This is quite disappointing.
How can I get this corrected? I tried to reset user settings (not a erase and reset). I figure it's not my phone, but my iTunes account thinks I'm doing something sharing-wise.
So, in summary, when I am on my iPhone and I go to my iTunes library and click on a song (that I have paid for) with the cloud and down arrow beside it, it's rejected. It says that I have too many apple id's associated with this device and I must wait 90 days. It counts down each day.
I just want access to the music I paid FOR!!
Thank you - I will contact support.
No, I cannot wait 90 days.
Let me restate what I have to endure, even though I own the music and have paid for iTunes Match:
I cannot access my purchases-zero-at all
I have to listen to never ending McDonald's latte commercials and progressive insurance. I have now banned both of those annoying businesses and will not make purchases there.
In summary, I have given Apple my money, and have zero to show for it.
Ps-you get to hear the McDonald's "love hate commercial" every 2 songs on iTunes Radio. I have substituted the words where I hate McDonald's and love when the commercial is over. -
Tablespace with too many extents are evil for performance
I think tablespaces with too many extents are not bad for performance after the introduction of LMT tablespaces. That may have been true with DMT tablespaces (earlier versions). This is just an observation after reading about LMT vs DMT tablespaces.
Experts - please comment with your thoughts.
Indeed, I work in an environment where there are over 50 databases to be administered, so we have lots of DBAs interacting with each other.
I'm stunned by this myth of "reorganization". Most of the DBAs move tables + rebuild indexes regularly generating huge redo on a monthly basis. Also provoking indexes to do all the splits again generating even more redo. They claim "it helps performance a lot" however not one is able to quantify and quite justify it other than "less extents less I/O, good". Even when I bring up the existence of shrink they say "do not like it, prefer the classic move". People really have a way of holding on to their good ol' practices of Oracle 8i.
For full table scans (which should never be done in an OLTP scenario) this extent issue would be relevant IF the data in the table is the victim of large deletes and Oracle hasn't re-used that space yet. If your multiblock read count is a multiple of your extent size, then there won't be any I/O call overhead, no matter the number of your extents. For OLTP this is not relevant because Oracle will access the table via ROWID.
I rarely have ever seen an index benefit from a rebuild significantly. In my experience what people often understand as "index fragmentation" is often just an unoptimized execution plan due to cardinality issues where oracle ends up fetching a large percentage of the table via single reads on that index. -
Please help me I don't know what is this mean : Your request cannot be completed because your Apple ID has been associated with too many credit cards.
And can you tell me what to do?
New one to me, you might try contacting Apple through iTunes Store Support
-
TOO MANY ROWS error experienced during DBMS_DATA_MINING.APPLY
On a test database that is running 10.1.0.2 when I execute APPLY specifying a data_table_name from another schema [data_schema_name], I get an ORA-1422, too many rows fetched.
However, when I exercise this same option [data table in another schema] on version 10.1.0.3, I get no error and the APPLY works.
Could someone tell me if this is an error that was fixed between the two versions and point me to the documentation that supports this?
Thanks,
Dianna
Dianna,
The behavior you observed could be a result of some of the security bug fixes that our group has back-ported to the 10.1.0.3 patchset release. ST releases security bundle patches periodically, and some of them were back-ported to earlier releases.
There is no specific documentation regarding security bundle patch contents, for security reasons.
Regards,
Xiafang -
SQL subquery returning too many rows with Max function
Hello, I hope someone can help me; I have been working on this all day. I need to get the max value, and the date and id that the max value is associated with, between specific date ranges. Here is my code. I have tried many different versions, but it is still returning
more than one id and date per group.
Thanks in advance
SELECT DISTINCT
    bw_s.id,
    avs.carProd,
    cd_s.RecordDate,
    cd_s.milkProduction  AS MilkProd,
    cd_s.WaterProduction AS WaterProd
FROM tblTest bw_s
INNER JOIN tblTestCp cd_s WITH (NOLOCK)
    ON bw_s.id = cd_s.id
    AND cd_s.recorddate BETWEEN '08/06/2014' AND '10/05/2014'
INNER JOIN
    (SELECT id, MAX(CarVol) AS carProd
     FROM tblTestCp
     WHERE recorddate BETWEEN '08/06/2014' AND '10/05/2014'
     GROUP BY id) avs
    ON avs.id = bw_s.id
id RecordDate carProd MilkProd WaterProd
47790 2014-10-05 132155 0 225
47790 2014-10-01 13444 0 0
47790 2014-08-06 132111 10 100
47790 2014-09-05 10000 500 145
47790 2014-09-20 10000 800 500
47791 2014-09-20 10000 300 500
47791 2014-09-21 10001 400 500
47791 2014-08-21 20001 600 500
And the result should be ( max carprod)
id RecordDate carProd MilkProd WaterProd
47790 2014-10-05 132155 0 225
47791 2014-08-21 20001 600 500
Help your readers help you. Remember that we cannot see your screen, do not know your data, do not understand your schema, and cannot test a query without a complete script. So - remove the derived table (to which you gave the alias "avs")
and the associated columns from your query. Does that generate the correct results? I have my doubts since you say "too many" and the derived table will generate a single row per ID. That suggests that your join between the first
2 tables is the source of the problem. In addition, the use of DISTINCT is generally a sign that the query logic is incorrect, that there is a schema issue, or that there is a misunderstanding of the schema. -
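One common way to get exactly one row per id (the one with the maximum CarVol) without a join at all is ROW_NUMBER. A sketch reusing the names from the post; I am assuming milkProduction and WaterProduction live in tblTestCp alongside CarVol, as the original join suggests:

```sql
SELECT id, RecordDate, carProd, MilkProd, WaterProd
FROM (
    SELECT cd.id,
           cd.RecordDate,
           cd.CarVol          AS carProd,
           cd.milkProduction  AS MilkProd,
           cd.WaterProduction AS WaterProd,
           -- Number the rows per id, highest CarVol first:
           ROW_NUMBER() OVER (PARTITION BY cd.id
                              ORDER BY cd.CarVol DESC) AS rn
    FROM tblTestCp cd
    WHERE cd.recorddate BETWEEN '08/06/2014' AND '10/05/2014'
) t
WHERE rn = 1;
```

This avoids both the DISTINCT and the row multiplication from joining the detail table back to the aggregate.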
Aggravations with Too Many Activations
Okay... here's the story (I'll try and keep it short... but no promises).
In 2000 I opened account # 1 with Adobe.
In 2004 I must have forgotten about my previous account so I opened a second under a different email address.
I had purchased some ebooks last year, and again this year in March using ADE to read them. I believe it was under account # 2, but it could have been under # 1. The computer was already activated when I imported the latest book.
I had to reinstall Windows for the usual Windows issues. But when I tried reading my ebooks, I kept getting error messages that the book was activated under another user, or that I should redownload the book (which is not an option). Neither account is working with the ebooks. And of course I ran out of activations on both accounts.
Case # 0202264241: General request for help with a summary. Opened 07-14-2009. No response yet.
Case # 0181091688: A request to reset activations on account # 1. Opened 08-02-2009. Case withdrawn by Adobe Support on 08-05-2009.
Response: "I understand that you are receiving the message about 'too many activations' We do not provide support for this service through this channel." The message then proceeded to instruct me to open a case through the same support web portal I had used to open the case that they were reading. Also the instructions they gave to open the new case were incorrect. However, there was a phone number on the bottom of the message. After 30 minutes and repeated requests to provide my serial number (?) for Digital Editions, I was told that voice support was not provided for free software, and I should open a case on the support web portal.
Case # 0181091742: A request to reset activations on account #2. Opened 08-02-2009. Case withdrawn by Adobe Support on 08-05-2009.
Response: "Unfortunately we are unable to assist with technical support issues beyond installation for Adobe Digital Editions via web support, and
there is no phone support at all for Adobe Digital Editions. For support information for Adobe Digital Editions please refer to the Adobe Flash Player Support Center, available at the URL below:
http://forums.adobe.com/community/adobe_digital_editions"
So here I am at the... um... Adobe Flash Player Support Center?
I've noticed that there are several Adobe staff who read these forums, and they appear to be very helpful and responsive. If any of you are reading this, please help!!!! If I get directed to the support portal again I'm going to have a nervous breakdown!
Thank you,
David
Hello
I also need help.
The same problem here, and I can't find any help.
Please let me know when you find a solution for this.
My mail address: [email protected]
best regards
Michael