author    | Steve Reinhardt <stever@eecs.umich.edu> | 2004-06-15 10:48:08 -0700
committer | Steve Reinhardt <stever@eecs.umich.edu> | 2004-06-15 10:48:08 -0700
commit    | d53c6c168afa7eafdf8c4fa10a10f835db25e3da (patch)
tree      | c001ef81c09fb8e52afb64df5b2a5a231fe7a352 /arch/alpha/isa_desc
parent    | 7b24ae00dc2b3f503a15c28a7728cfc9a3e9299f (diff)
download  | gem5-d53c6c168afa7eafdf8c4fa10a10f835db25e3da.tar.xz
Get software prefetching to work in full-system mode.
Mostly a matter of keeping prefetches to invalid addresses
from clobbering the VM fault IPRs. Also discovered that wh64s
were not being treated as prefetches when they really should
be (for the most part, anyway).
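
Conceptually, the fix boils down to the TLB quietly dropping a translation miss when the request is marked NO_FAULT, rather than updating the VM fault IPRs and raising a page fault. The standalone C++ sketch below illustrates only that idea; the flag values, the Translation struct, and the toy lookup() are invented for illustration and are not the actual gem5 interfaces.

```cpp
#include <cstdint>
#include <iostream>
#include <optional>

// Assumed memory-request flags; NO_FAULT marks accesses (software
// prefetches, wh64 write hints) that must never raise a page fault.
enum MemReqFlags : uint32_t {
    LOCKED       = 1u << 0,
    EVICT_NEXT   = 1u << 1,
    PF_EXCLUSIVE = 1u << 2,
    NO_FAULT     = 1u << 3,
};

enum class Fault { NoFault, DtbPageFault };

// Toy page table: only the first 1 MB of the address space is mapped.
std::optional<uint64_t> lookup(uint64_t vaddr) {
    if (vaddr < (1ULL << 20))
        return vaddr;                       // identity-mapped for the sketch
    return std::nullopt;
}

struct Translation {
    Fault fault;
    bool squashed;                          // access dropped, e.g. a prefetch
    uint64_t paddr;                         // to an unmapped address
};

Translation translate(uint64_t vaddr, uint32_t flags) {
    if (auto pa = lookup(vaddr))
        return {Fault::NoFault, false, *pa};
    if (flags & NO_FAULT)                   // prefetch miss: drop it silently
        return {Fault::NoFault, true, 0};   // and leave the fault IPRs alone
    // Ordinary access: this is where the fault IPRs (faulting VA, etc.)
    // would be recorded before signalling the fault.
    return {Fault::DtbPageFault, false, 0};
}

int main() {
    auto pf = translate(0xffff0000ULL, NO_FAULT);
    auto ld = translate(0xffff0000ULL, 0);
    std::cout << "prefetch squashed: " << pf.squashed << '\n';                  // 1
    std::cout << "load faults: " << (ld.fault == Fault::DtbPageFault) << '\n';  // 1
}
```

With this behavior a prefetch to an unmapped or invalid address simply becomes a no-op, so the fault registers still describe the last genuine fault when the OS looks at them.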
arch/alpha/alpha_memory.cc:
arch/alpha/alpha_memory.hh:
- Get rid of intrlock flag for locking VM fault regs (a la EV5);
instead, just don't update regs on VPTE loads (a la EV6).
- Add NO_FAULT MemReq flag to indicate references that should not
cause page faults (i.e., prefetches).
arch/alpha/ev5.cc:
- Get rid of intrlock flag for locking VM fault regs (a la EV5);
instead, just don't update regs on VPTE loads (a la EV6).
- Add Fault trace flag.
arch/alpha/isa_desc:
- Add NO_FAULT MemReq flag to indicate references that should not
cause page faults (i.e., prefetches).
- Mark wh64 as a "data prefetch" instruction so it gets controlled
properly by the FullCPU data prefetch control switch.
- Align the wh64 EA in the decoder so the issue stage doesn't need to worry about it.
arch/alpha/isa_traits.hh:
- Get rid of intrlock flag for locking VM fault regs (a la EV5);
instead, just don't update regs on VPTE loads (a la EV6).
base/traceflags.py:
- Add Fault trace flag.
cpu/simple_cpu/simple_cpu.hh:
- Pass the MemReq flags through to the writeHint() operation (see the sketch after these notes).
cpu/static_inst.hh:
- Update comment re: prefetches.
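
Rounding out the simple_cpu note above: the sketch below shows a write hint carrying its memory-request flags, so the memory system can tell the access must not fault. ExecContextSketch and its writeHint() signature are invented stand-ins; only the idea of forwarding the flags comes from this commit.

```cpp
#include <cstdint>
#include <iostream>

// Assumed bit for the NO_FAULT MemReq flag described in the notes above.
constexpr uint32_t NO_FAULT = 1u << 3;

// Invented stand-in for the CPU's execution context.  The point of the
// change is only that the request flags now travel with the hint, so the
// memory system can see that a wh64 must not cause a page fault.
struct ExecContextSketch {
    void writeHint(uint64_t ea, int size, uint32_t flags) {
        std::cout << "write hint: ea=0x" << std::hex << ea << std::dec
                  << " size=" << size
                  << ((flags & NO_FAULT) ? " [no-fault]" : "") << '\n';
    }
};

int main() {
    ExecContextSketch xc;
    xc.writeHint(0x10000100, 64, NO_FAULT);   // flags forwarded with the hint
}
```

A CPU model that wants to suppress data prefetches entirely can also key off the IsDataPrefetch instruction flag that this commit now sets on wh64.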
--HG--
extra : convert_revision : 62e466b0f4c0ff9961796270fa2e371ec24bcbb6
Diffstat (limited to 'arch/alpha/isa_desc')
-rw-r--r-- | arch/alpha/isa_desc | 11
1 file changed, 6 insertions, 5 deletions
diff --git a/arch/alpha/isa_desc b/arch/alpha/isa_desc
index 9fee12485..6c5912c52 100644
--- a/arch/alpha/isa_desc
+++ b/arch/alpha/isa_desc
@@ -1023,7 +1023,7 @@ def LoadStoreBase(name, Name, ea_code, memacc_code, postacc_code = '',
     # and memory access flags (handled here).
     # Would be nice to autogenerate this list, but oh well.
-    valid_mem_flags = ['LOCKED', 'EVICT_NEXT', 'PF_EXCLUSIVE']
+    valid_mem_flags = ['LOCKED', 'NO_FAULT', 'EVICT_NEXT', 'PF_EXCLUSIVE']
     inst_flags = []
     mem_flags = []
     for f in flags:
@@ -1072,7 +1072,7 @@ def format LoadOrPrefetch(ea_code, memacc_code, *pf_flags) {{
     # Declare the prefetch instruction object.
     # convert flags from tuple to list to make them mutable
-    pf_flags = list(pf_flags) + ['IsMemRef', 'IsLoad', 'IsDataPrefetch', 'MemReadOp']
+    pf_flags = list(pf_flags) + ['IsMemRef', 'IsLoad', 'IsDataPrefetch', 'MemReadOp', 'NO_FAULT']
     (pf_header_output, pf_decoder_output, _, pf_exec_output) = \
         LoadStoreBase(name, Name + 'Prefetch', ea_code, '',
@@ -2369,9 +2369,10 @@ decode OPCODE default Unknown::unknown() {
     }

     format MiscPrefetch {
-    0xf800: wh64({{ EA = Rb; }},
-                 {{ xc->writeHint(EA, 64); }},
-                 IsMemRef, IsStore, MemWriteOp);
+    0xf800: wh64({{ EA = Rb & ~ULL(63); }},
+                 {{ xc->writeHint(EA, 64, memAccessFlags); }},
+                 IsMemRef, IsDataPrefetch, IsStore, MemWriteOp,
+                 NO_FAULT);
     }

     format BasicOperate {
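
The decoder-level change replaces `EA = Rb` with `EA = Rb & ~ULL(63)`: the effective address is rounded down to the start of its 64-byte block before the hint is issued, so later pipeline stages no longer have to align it. A quick standalone check of the mask (the Rb value below is arbitrary):

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    uint64_t rb = 0x2468ACFULL;          // arbitrary register value
    uint64_t ea = rb & ~uint64_t(63);    // ~63 clears the low 6 bits
    std::printf("0x%llx -> 0x%llx\n",    // prints: 0x2468acf -> 0x2468ac0
                (unsigned long long)rb, (unsigned long long)ea);
    return 0;
}
```

Clearing the low six bits gives exactly the 64-byte alignment wh64 operates on, since 2^6 = 64.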