Luna II + Firewire interface

zircon
Posts: 34
Joined: Wed Jan 17, 2007 11:40 pm

Post by zircon »

Really...? This is turning out to be something of a headache...
garyb
Moderator
Posts: 23374
Joined: Sun Apr 15, 2001 4:00 pm
Location: ghetto by the sea

Post by garyb »

scope is hardware. how can you render hardware?
hubird

Post by hubird »

...
zircon
Posts: 34
Joined: Wed Jan 17, 2007 11:40 pm

Post by zircon »

gary: I assumed it functioned like PowerCore or UAD cards. Is that an unreasonable assumption to make, given that their designs are extremely similar? Oy. I'm not sure this card is really going to fit into my workflow at this rate. Perhaps I should sell it to someone who would get more use out of it.
arela
Posts: 858
Joined: Tue Aug 27, 2002 4:00 pm
Location: Norway

Post by arela »

so I could access plugins like PSY-Q and Optimaster. I don't want to use the card for my normal audio stuff - just XTC plugin mode
Maybe this would be a lot easier without using XTC.
You might use Scope with e.g. Cool Edit or Audition (pre 2.0) alongside your other system.

I'm not sure, but this is what I think, at least.
garyb
Moderator
Posts: 23374
Joined: Sun Apr 15, 2001 4:00 pm
Location: ghetto by the sea

Post by garyb »

uad or tc plugins can't be rendered either. they require "real time" export just like scope plugins. it seems that you are married to a "workflow" and are not that knowledgeable about gear (that's no crime). expanding your world might be a good thing in the long run...
zircon
Posts: 34
Joined: Wed Jan 17, 2007 11:40 pm

Post by zircon »

What are you talking about, dude? I own a Powercore Element. It integrates flawlessly - you just open the plugins as VSTs in your sequencer and they render normally like any other VST plugin. I don't have a UAD-1 but even a cursory google search reveals plenty of users explaining how the instantiated plugins are used like any other VST. In fact, I've seen posts specifically recommending you *don't* use real time rendering for UAD cards, and that non-real time is preferable! Of course there seems to be debate on this topic, but there is no question that the UAD plugs CAN be rendered normally.

So again, it's really not an unreasonable assumption I made. And frankly, the inability of Creamware plugins to be properly integrated and rendered like normal VSTs is not something to be defended.
garyb
Moderator
Posts: 23374
Joined: Sun Apr 15, 2001 4:00 pm
Location: ghetto by the sea

Post by garyb »

hmmm,
ok, it's possible that you are right. how about this: try a non-realtime render. if the plugin runs on the dsp and not the cpu, then offline is not possible, because offline computations are done by the cpu. dsps are realtime processors; they don't function like a cpu does, and the processes that run on them are different. rendering must be done in real time to pass the audio through the dsps and back to the system. it's not a big issue though. a 4 minute song takes a mere 4 minutes.
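
To make the distinction concrete: a native plugin is bounced offline by feeding it buffers as fast as the CPU can process them, while a DSP-card plugin only accepts and returns audio at the sample clock, so a bounce can never run faster than 1x. The following is a toy Python sketch of that timing difference only, not Scope code; the sample rate and buffer size are arbitrary illustrative values.

import time

SAMPLE_RATE = 44100      # samples per second (illustrative)
BUFFER_SIZE = 512        # samples handed to the plugin per block (illustrative)
SONG_SECONDS = 4 * 60    # the 4-minute song from the example above

num_buffers = SONG_SECONDS * SAMPLE_RATE // BUFFER_SIZE

def offline_bounce():
    # Native (CPU) plugin: buffers are processed as fast as the CPU allows,
    # so the bounce finishes in a fraction of the song length.
    for _ in range(num_buffers):
        pass  # plugin math would run here, unpaced

def realtime_bounce():
    # DSP-card plugin: audio physically passes through the card at the sample
    # clock, so each buffer costs one buffer's worth of wall-clock time.
    buffer_seconds = BUFFER_SIZE / SAMPLE_RATE
    for _ in range(num_buffers):
        time.sleep(buffer_seconds)  # paced by the hardware clock

start = time.time()
offline_bounce()
print(f"offline bounce took {time.time() - start:.2f} s")  # a fraction of a second

# realtime_bounce() would take ~240 s no matter how fast the CPU is;
# uncomment to watch it pace itself:
# realtime_bounce()
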
zircon
Posts: 34
Joined: Wed Jan 17, 2007 11:40 pm

Post by zircon »

Alright - so really I should be looking at this card more like a piece of outboard gear with a 100% software interface. Like a multi-effects box with some built-in DSP. That makes a lot more sense.

The routing on the SCOPE platform is still confusing to me. Can you walk me through routing audio from one source through PSY-Q and OptiMaster, then recording it to Audacity? Let's say I already have the rendered WAV that is unmastered.
hubird

Post by hubird »

just load the ASIO Source and Destination modules (choose the number of ASIO channels in the source module to your needs), connect them to a mixer of your choice (Source to inputs, mix-out or monitor-out to Destination).
load the inserts you need.

Then launch your sequencer program.

It should then 'see' the available ASIO channels in the appropriate place in your sequencer; check the 'audio menu' or something similar.

Route the output of each audio track to one of those ASIO-out channels.
Route the ASIO input (the Scope mixer's out, actually) to a track to record to, and mute that track to prevent feedback :-)

Start rendering.
Pardon, start recording in real time :-D
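
If you would rather script the real-time capture than arm a track in Audacity or a sequencer, the same idea can be done from a few lines of Python. This is only a sketch under assumptions not stated above: it needs the third-party sounddevice and soundfile packages, and the device name "Scope" is a placeholder for whatever entry the Scope driver actually exposes in sd.query_devices().

import sounddevice as sd   # third-party: pip install sounddevice
import soundfile as sf     # third-party: pip install soundfile

SAMPLE_RATE = 44100
SONG_SECONDS = 4 * 60      # length of the material being bounced
DEVICE = "Scope"           # placeholder -- pick the real name from the list below

print(sd.query_devices())  # shows every input/output the system can see

# Record whatever arrives on the chosen input (i.e. the Scope mixer's output)
# for the full song length. This blocks in real time: 4 minutes of audio
# takes 4 minutes to capture.
audio = sd.rec(int(SONG_SECONDS * SAMPLE_RATE),
               samplerate=SAMPLE_RATE,
               channels=2,
               device=DEVICE)
sd.wait()

sf.write("mastered.wav", audio, SAMPLE_RATE)
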
arela
Posts: 858
Joined: Tue Aug 27, 2002 4:00 pm
Location: Norway

Post by arela »

The routing on the SCOPE platform is still confusing to me. Can you walk me through routing audio from one source through PSY-Q and OptiMaster, then recording it to Audacity? Let's say I already have the rendered WAV that is unmastered.
Simple direct routing:
ASIO2 > PSY-Q > OptiMaster > Wave Dest & (e.g. Analog Out for monitoring)

Remember: 1 output can be connected to 1, 2 or more inputs.
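
The connection rule is the only non-obvious part: Scope routing is a graph, and one output can fan out to several inputs. Purely as an illustration (the module names simply follow the chain above and are not literal Scope identifiers), the same topology written out in Python:

# One-to-many routing: each output feeds a list of inputs.
routing = {
    "ASIO2 Source":   ["PSY-Q in"],
    "PSY-Q out":      ["OptiMaster in"],
    "OptiMaster out": ["Wave Dest in", "Analog Out (monitoring)"],  # fan-out
}

for output, inputs in routing.items():
    for destination in inputs:
        print(f"{output} -> {destination}")
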